Recent News

CeBIT 2017: CISPA Exhibits 7 Selected Research Projects

April 5, 2017

CISPA presented a selection of its research projects at this year’s CeBIT. Due to the continuing growth of the Saarbruecken Center for IT-Security, the projects …


Strong media reaction after spying doll “Cayla” is banned due to privacy concerns raised by S. Hessel (CISPA)

March 1, 2017

Stefan Hessel, a student of CISPA researcher Prof. Sorge, has raised legal concerns about the children’s doll “My friend Cayla” in his article (in German) …


CISPA’s Parliamentary Evening in Berlin

February 20, 2017

On February 13, 2017, CISPA organized a Parliamentary Evening on “Autonomous Systems” in Berlin. More than 150 people accepted CISPA’s invitation to “Landesvertretung des Saarlandes”.



Project Group B: Controlling Privacy

Projects in this group focus on the enforcement of privacy properties in applications that play a central role in the daily life of their users.
The projects will deliver a variety of privacy definitions suitable to capture the notion of privacy in different application domains, along with corresponding enforcement mechanisms.
A common challenge is to resolve the inherent tension between privacy and system functionality, coming up with privacy notions and enforcement techniques that provide strong guarantees yet do not hamper performance or usability.
A related challenge is to characterize attacker models that are strong enough to cover the attack surface of the various application domains, yet rest on realistic assumptions about attacker capabilities in order to preserve efficiency and usability.
We will develop innovative techniques to enforce a wide range of fine-grained security policies on Web applications and mobile devices at run-time; devise foundational cryptographic solutions to protect the privacy of user data in a broad spectrum of cloud services, including data-sharing applications and computing environments; design a novel secure hardware-assisted platform for privacy-preserving analytics that provides users with rigorous privacy guarantees and, at the same time, analysts with accurate results; develop techniques to enhance the anonymity of users online, preventing tracking and censorship in anonymous communication networks; and design an architecture to protect users’ privacy in the presence of mobile and wearable devices with recording capabilities.
We will develop programming principles and methodologies to support the development of secure and privacy-preserving applications, as well as methods and tools to prove formal privacy guarantees in protocols and systems, making use of cryptography to protect user data.


B1: Privacy Enforcement for Third-Party Software

Bernd Finkbeiner & Christian Hammer

When third-party software runs on a device, user privacy is often not honored, even by software from trusted entities.
Recent incidents in which mobile applications (e.g., Twitter or Facebook) silently transferred the complete address book to their servers, and a study showing that many of the top Web sites on the internet leak personal data in unintended ways, illustrate that this threat is real and that users must actively protect their privacy.
We envision a mechanism that allows the user to enforce precise restrictions on the information flow in an application, even if these restrictions are not offered as an option in the application, or may in fact run contrary to the objectives of the (potentially disingenuous) developer of the application.
For example, the policy “Do not transmit any audio before a call has been initiated or after it has been completed” restricts the information flow far more precisely than a standard access control policy.
Information flow policies are naturally characterized using logics for temporal hyperproperties; hence, we will identify the enforceable sublogic of HyperCTL*.
We will investigate a wide spectrum of enforcement algorithms for our policies, partition applications according to the principle of least privilege for more efficient enforcement, and evaluate our enforcement mechanisms on representative Web and mobile apps with complex information flow properties and privacy guarantees.
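To illustrate, the example audio policy above corresponds to a small finite-state run-time monitor. The following Python sketch is purely illustrative and not the project’s enforcement mechanism; the event names (“call_start”, “call_end”, “audio_send”) are hypothetical.

```python
# A minimal sketch, not the project's enforcement mechanism: a finite-state
# run-time monitor for the example audio policy. The event names
# ("call_start", "call_end", "audio_send") are hypothetical.

class AudioPolicyMonitor:
    """Tracks call state and vetoes audio transmission outside calls."""

    def __init__(self):
        self.in_call = False

    def observe(self, event: str) -> bool:
        """Return True if the event may proceed, False if it must be blocked."""
        if event == "call_start":
            self.in_call = True
        elif event == "call_end":
            self.in_call = False
        elif event == "audio_send":
            return self.in_call    # audio may flow only during a call
        return True                # all other events are policy-irrelevant

monitor = AudioPolicyMonitor()
assert monitor.observe("audio_send") is False   # blocked before the call
monitor.observe("call_start")
assert monitor.observe("audio_send") is True    # allowed during the call
monitor.observe("call_end")
assert monitor.observe("audio_send") is False   # blocked after the call
```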


B2: Programming Principles and Abstractions for Privacy

Deepak Garg & Christian Hammer

Privacy is generally enforced during application deployment and execution, without any input from the app developer on how to constrain or modify the app’s behavior if it violates a site-specific privacy policy.
As a remedy, we propose to build programming language abstractions and programming principles that allow an app developer to enforce privacy by design, taking into account the possibility that the app may have to react to privacy constraints during deployment. We will investigate the well-known but little-understood privacy-utility trade-off for applications that are written from scratch in a policy-aware setting.
We envision a system where app developers structure their code into modules.
Each module provides certain functionality and requires certain permissions: the more permissive the user’s privacy policies, the more functionality can be provided. Static and dynamic analysis techniques will ensure that the policy on each module and the user’s privacy permissions are enforced.
The system will support gradual app development, where a developer may make her app privacy-compliant one module at a time.
We will evaluate our system based on realistic case studies for Web and Android applications.
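As a concrete illustration of the envisioned structure, the following Python sketch (a hypothetical API, not the project’s system) shows modules declaring the permissions they require and a loader enabling only the modules covered by the user’s policy.

```python
# A sketch of the envisioned module structure (hypothetical API, not the
# project's system): each module declares the permissions it requires, and
# only modules covered by the user's privacy policy are enabled.

REQUIRED = {}  # module name -> set of required permissions

def requires(*permissions):
    """Decorator recording which permissions a feature module needs."""
    def register(func):
        REQUIRED[func.__name__] = set(permissions)
        return func
    return register

@requires("contacts.read")
def suggest_friends():
    return "suggestions based on the address book"

@requires()  # needs no sensitive permissions
def show_timeline():
    return "public timeline"

def enabled_features(user_permissions):
    """The more permissive the policy, the more functionality is enabled."""
    return [name for name, needed in REQUIRED.items()
            if needed <= set(user_permissions)]

print(enabled_features([]))                 # ['show_timeline']
print(enabled_features(["contacts.read"]))  # ['suggest_friends', 'show_timeline']
```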


B3: Computationally Sound Reasoning about Privacy Properties

Michael Backes & Véronique Cortier

Proving security and privacy properties of protocols that rely on cryptographic operations constitutes a highly complex and error-prone task.
As a consequence, the security of such protocols is usually analyzed by replacing cryptographic operations with symbolic abstractions that obey simple cancellation rules.
However, carrying such analyses over to real-world implementations requires showing that these abstractions can be soundly implemented using suitable cryptographic primitives, and existing research approaches fall short of achieving this goal for privacy-sensitive scenarios in many respects: (a) they pertain to restricted programming languages and adversarial capabilities; (b) they fall short of addressing strong secrecy properties, the central requirement in privacy-sensitive applications; and (c) they fail to compose with other results of this form.
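For intuition, such a symbolic abstraction can be pictured as a toy Dolev-Yao model with a single cancellation rule, dec(enc(m, k), k) = m. The Python sketch below is purely illustrative and not the project’s formalism.

```python
# A toy Dolev-Yao style abstraction, purely for intuition and not the
# project's formalism: ciphertexts are opaque terms, and the only
# equation is the cancellation rule dec(enc(m, k), k) = m.

from dataclasses import dataclass

@dataclass(frozen=True)
class Enc:
    msg: object   # plaintext term
    key: object   # key term

def dec(term, key):
    """Symbolic decryption: succeeds only with the matching key."""
    if isinstance(term, Enc) and term.key == key:
        return term.msg   # cancellation: dec(enc(m, k), k) = m
    return None           # otherwise the adversary learns nothing

c = Enc("m", "k")
assert dec(c, "k") == "m"     # the matching key recovers the plaintext
assert dec(c, "k2") is None   # a wrong key yields no information
```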
The objective of this project is to overcome these limitations, by developing novel methods to enable a computationally sound analysis of privacy properties in modern cryptographic protocols.
This encompasses support for soundly analyzing symbolic abstractions of interactive cryptographic primitives that are at the core of many privacy-protection technologies, and the consideration of extended, privacy-critical adversarial capabilities that extract sensitive information by observing the timing behavior of cryptographic protocols.
Building upon these results, we aim to enable symbolic, yet cryptographically sound reasoning about Oblivious RAM (one of the currently most important interactive privacy-protection technologies).
We will furthermore aim to extend our soundness results to additional programming languages, in particular to Android’s Dalvik Bytecode.
Finally, we will derive composability guarantees for computational soundness for comprehensive privacy properties based on prior work on composable deduction soundness.


B4: Privacy-Preserving Cloud Storage

Matteo Maffei & Dominique Schröder

Cloud storage has rapidly gained a central role in digital user habitats.
While this comes with tremendous benefits, it incurs privacy threats for data owners (those whose information is stored in the cloud, e.g., patients), since they have little control over the activities of data clients (those who access the information, e.g., hospitals); it likewise incurs privacy threats for data owners and data clients alike, since cloud storage providers can monitor sensitive information.
How to allow fine-grained access control to (encrypted) cloud data while hiding the access patterns from the storage provider?
How to support data clients without revealing individual data owners’ identities?
What are possible trade-offs between efficiency and privacy? To address these questions, we will investigate privacy-preserving outsourced databases in the presence of multiple data owners and clients.
We will formalize appropriate privacy notions, develop cryptographic primitives supporting these notions, and investigate the minimal cryptographic assumptions necessary to instantiate these primitives.
Towards trade-offs, we will design cryptographic protocols with minimal communication complexity and weaker notions of privacy.
We will build a framework in which clients access data by evaluating dedicated functions on (encrypted) data.
We will investigate the application of our techniques to the setting of public logs, where legal authorities may request access to the data.
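To make the efficiency/privacy trade-off concrete, the crudest way to hide access patterns from the provider is a linear scan: the client touches and re-encrypts every record on every read, so the provider’s view is independent of which record was requested. The following Python sketch is a toy baseline under simplifying assumptions (one-time pads stand in for real encryption), not a project deliverable; practical schemes such as tree-based ORAM reduce the linear overhead to polylogarithmic.

```python
# A toy baseline (not a project deliverable) for hiding access patterns:
# the client touches and re-encrypts every record on every read, so the
# provider's view is identical no matter which record is requested.
# One-time pads stand in for a real cipher; the linear cost per access is
# exactly the efficiency/privacy trade-off discussed above.

import os

def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

class LinearScanStore:
    def __init__(self, records):
        self.pads = [os.urandom(len(r)) for r in records]   # client-side keys
        self.cloud = [xor_bytes(r, p) for r, p in zip(records, self.pads)]

    def read(self, index: int) -> bytes:
        result = b""
        for i in range(len(self.cloud)):                    # scan every record,
            plain = xor_bytes(self.cloud[i], self.pads[i])
            self.pads[i] = os.urandom(len(plain))           # refresh its pad,
            self.cloud[i] = xor_bytes(plain, self.pads[i])  # and re-encrypt
            if i == index:
                result = plain
        return result

store = LinearScanStore([b"alice", b"bob42"])
assert store.read(1) == b"bob42"   # provider cannot tell which index was read
```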


B5: Anonymous and Censorship-Resistant Communication

Matteo Maffei & Christian Rossow

Anonymous communication networks (like Tor and I2P) are essential to protect user privacy.
Despite their popularity, these networks are affected by a fundamental weakness, undermining their functionality: potential adversaries can determine which traffic enters an anonymous communication network, enabling them to censor “undesired” communication.
In such an environment, users have to resort to non-anonymous communication, severely affecting their privacy (and, in case of further censoring activities, their freedom of speech).
This project aims at making anonymous communication networks censorship-resistant: Can we reliably enable censored users to bootstrap anonymous communication?
How to evade address-based filters (such as blacklists)?
How to evade content-based filters (such as signature matching)?
We will devise a decentralized trust metric and privacy-preserving trust networks, in combination with honeypots to identify and track the behavior of censoring adversaries, to solve the problem of anonymous communication bootstrapping.
We will explore machine learning methods to evaluate and harness the potential of covert communication methods for hiding communication content.
We will integrate our technologies to design practical censorship-resistance plugins for the popular anonymous communication networks Tor and I2P.


B6: Privacy-Friendly Data Analytics

Paul Francis & Matteo Maffei

User data are constantly collected by various organizations for the purpose of aggregate analysis.
Even though the data are considered only in the aggregate, this threatens user privacy in two fundamental ways: the query result may leak too much information, or the data aggregator itself may leak the collected data, intentionally (e.g., by selling it) or unintentionally (by being compromised).
Furthermore, the analyst is often interested in querying joined data collected by different organizations.
How to sanitize query results in ways that resolve the tension between privacy and functionality?
How to securely store and share the user data?
Existing notions of differential privacy address the first question and protect user privacy, yet they provide insufficient utility to be accepted by stakeholders.
We will investigate noiseless and user-centric approaches to provide precise query results and bypass the privacy budget limitation of current differential privacy notions; we will devise cryptographic techniques, based on secure multiparty computation, to allow queries over distributed data in a privacy-preserving manner.
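For background, the following Python sketch shows the standard Laplace mechanism that underlies most differential privacy deployments; it illustrates the noise/utility tension and the per-query privacy budget that this project aims to move beyond. The dataset, query, and epsilon value below are hypothetical.

```python
# A minimal sketch of the standard Laplace mechanism, for background only;
# the project aims to go beyond such noise-based approaches. The dataset,
# query, and epsilon below are hypothetical.

import math
import random

def laplace_noise(scale: float) -> float:
    """Sample Laplace(0, scale) via inverse-transform sampling."""
    u = random.uniform(-0.5, 0.5)
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def private_count(records, predicate, epsilon: float) -> float:
    """Release a counting query with epsilon-differential privacy.
    Adding or removing one record changes the count by at most 1
    (sensitivity 1), so Laplace(1/epsilon) noise suffices. Each release
    consumes epsilon of the overall privacy budget; smaller epsilon means
    stronger privacy but a noisier, less useful answer."""
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon)

ages = [23, 35, 42, 29, 61, 33]
print(private_count(ages, lambda a: a >= 30, epsilon=0.5))  # ~4, plus noise
```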
To address the second question, we advocate the use of secure hardware as the basis for protecting user data while allowing for aggregate analytics.
We will devise a hardware-assisted architecture for the privacy-preserving processing of user data, of general applicability and providing privacy guarantees against a realistic attacker model.
We will develop code-hiding, property-based attestation techniques to establish trust in such hardware-assisted privacy-preserving online services.


B7: Privacy-Preserving Digital Capture

Peter Druschel & Bernt Schiele

The recording capabilities of smart phones and wearable devices pose a serious threat to the privacy of bystanders who are recorded inadvertently without their consent.
It would be impractical for bystanders to voice their preferences to anyone wearing a potential recording device, and equally impractical to impose restrictions on the use of such devices in all relevant areas (airports, bars, …).
We envision a technical solution in which recording devices receive the privacy preferences of nearby users, and enforce these preferences by obfuscating the subjects in question in the recorded media.
We will develop suitable signatures for identifying users in recorded media, strive for accuracy through combining multiple (cross-modal) sources, and strive for energy efficiency by relying on cloud services.
We will investigate suitable user privacy policy specifications, context sensing methods relevant to such policies, and methods for obfuscating subjects prior to releasing the media to applications.
To avoid endangering the privacy of bystanders in the process, we will employ homomorphic encryption methods and secure function evaluation on encrypted signatures, and protect against cloud providers as well as unintended information flows by relying on methods developed in the preceding projects.
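To sketch the envisioned enforcement flow, the following Python fragment (a hypothetical data model, not the project’s design) shows a capture pipeline matching detected subjects against opt-out signatures broadcast by nearby users and obfuscating the matching regions before releasing a frame.

```python
# A minimal sketch (hypothetical data model, not the project's design) of
# the enforcement flow envisioned above: the capture pipeline receives
# opt-out preferences broadcast by nearby users, matches them against
# subjects detected in a frame, and obfuscates the matching regions
# before the frame reaches any application.

from dataclasses import dataclass

@dataclass
class DetectedSubject:
    signature: str   # identifier derived from the recorded media
    region: tuple    # (x, y, w, h) bounding box in the frame

def enforce_preferences(frame, subjects, opt_out_signatures, blur):
    """Blur every detected subject whose signature has opted out."""
    for subject in subjects:
        if subject.signature in opt_out_signatures:
            frame = blur(frame, subject.region)
    return frame

# Usage with a stand-in blur that just records what would be obfuscated:
masked = []
frame = enforce_preferences(
    frame="raw-frame",
    subjects=[DetectedSubject("sig-alice", (10, 10, 64, 64)),
              DetectedSubject("sig-bob", (120, 40, 64, 64))],
    opt_out_signatures={"sig-alice"},
    blur=lambda f, region: (masked.append(region), f)[1],
)
assert masked == [(10, 10, 64, 64)]   # only the opt-out subject is blurred
```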