
Tuesday, September 6, 2016

Requirements and security goals for the Cloud

This is a two-part blog post giving an overview of the requirements and security goals for the Internet of Things (IoT) and the Cloud.

But first ... I'd like to mention that the ECRYPT-NET Summer School on cryptography for the cloud in Leuven is coming up soon.
This blog post can be seen as an intro to cryptology in the cloud, so we will surely hear about some of these topics in depth at the event. I'm looking forward to a good discourse during the scheduled talks and to the discussions afterwards.

Part I: Requirements and security goals for the Cloud


The cloud is loosely defined here as computing and storage resources outsourced to off-site servers that are accessible on demand via the Internet.

Overview of cloud computing services to be secured
(Image by Wikimedia user Sam Johnston)
The usage of cloud services should thus be viewed as a potential threat: inherently critical data leaves personal control once it is uploaded, which immediately raises several security as well as privacy issues.

Requirements and the corresponding cryptologic counter-measures are defined for various use-cases (such as those depicted in the image) and briefly explained below. It is important that this is done without deliberately weakening (or "back-dooring") solutions, as is sometimes suggested. To ensure democratic use of the developed technologies it is vital to see that the difference between a "front door" and a "back door" is merely one's viewpoint: intentionally implementing two entrances makes an attacker happy, not the legitimate user.

Transparent services, meaning the use of the cloud as if it were on-site, should ideally be built upon a back-end with verifiable, clear, possibly standardized cryptologic concepts and open code for public scrutiny.

To capture real-world cloud settings, one has to view them from different perspectives: first that of a private entity (a user), then that of an organization (such as a company), and lastly the global perspective of, say, a government.

An interesting starting point for the threats we have to consider is the Cloud Security Alliance's 2016 report on cloud security, which names twelve critical issues ranked by severity of real-world impact.

In this report, data breaches are followed directly by weak identity, credential and access management and by insecure Application Programming Interfaces (APIs): issues that can be addressed by assessing requirements, tailoring cryptographic solutions from the beginning, and deploying state-of-the-art implementations instead of sticking to legacy code.


Roughly four categories of requirements for the usages of the cloud can be distinguished:
  • Computations in the Cloud
  • Sharing Data in the Cloud
  • Information Retrieval from the Cloud
  • Privacy Preservation in the Cloud
The approaches to securing these areas are manifold. Generally speaking, data minimization is a fundamental paradigm for efficient interaction with the cloud, while security-by-design, considering best practices from the beginning, is enabled by encapsulating the scope of each method.
The following, briefly explained, concepts tackle concrete use-cases for end-users, companies and e-Government tasks alike:
  • Order-Preserving Encryption (OPE) allows efficient range queries, but does it diminish security too much?
  • Format-Preserving Encryption (FPE) offers in-place encryption in legacy databases.
  • Data De-Duplication (DDD) enhances servers' back-end performance.
  • Secret Sharing Schemes (SSS) solve back-up and availability issues by distributing encrypted parts of the whole (a minimal sketch follows this list).
  • Malleable Signature Schemes (MSS) provide flexible authentication for documents and derived data.
  • Private Information Retrieval (PIR) allows private access to elements of a database.
  • Provable Data Possession (PDP) ensures that the cloud is storing the (uncorrupted) files.
  • Multi-Party Computation (MPC) allows secure cooperation across the Internet.
  • Verifiable Computation (VC) builds trust in the results of a delegated computation.
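
To give a concrete taste of one of these building blocks, here is a minimal Python sketch of n-out-of-n additive secret sharing, the simplest instance of SSS. The prime and the function names are our own illustration; real deployments would rather use a threshold scheme such as Shamir's, so that fewer than n shares already suffice for reconstruction.

```python
import secrets

PRIME = 2**127 - 1  # all arithmetic is modulo this prime

def share(secret: int, n: int) -> list[int]:
    """Split `secret` into n additive shares that sum to it mod PRIME."""
    shares = [secrets.randbelow(PRIME) for _ in range(n - 1)]
    shares.append((secret - sum(shares)) % PRIME)
    return shares

def reconstruct(shares: list[int]) -> int:
    """Recombine all n shares; any n-1 of them look uniformly random."""
    return sum(shares) % PRIME

if __name__ == "__main__":
    s = 42
    parts = share(s, 5)          # e.g. distribute to 5 cloud servers
    assert reconstruct(parts) == s
```

Distributing such shares across independent providers addresses availability (and confidentiality against any single provider) without relying on one server's integrity.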
We can observe that the cryptology community is striving for efficient cryptographic primitives that serve as building blocks for implementations. Naturally, researchers base their constructions on well-established security assumptions and derive clear security proofs to offer, e.g., long-term confidentiality. This is not least important because of the emergence of quantum computers: powerful quantum algorithms pose a threat that needs to be addressed today, before current solutions are moved to the cloud, where the constructions would be exposed and possibly vulnerable to a quantum attacker.

Let's hope this trend continues and ultimately leads to reliable, standardized primitives for real-world applications.


Stay tuned for Part II: Requirements and security goals for the IoT...

Monday, January 18, 2016

Sixth Bar-Ilan University Winter School on Cryptography (Part 2)

This two-part blog post has been collaboratively written by Eduardo, Marie-Sarah, Matthias and Ralph.

On Wednesday, the second part (special encryption) of the school began.

First, Mor Weiss introduced us to format-preserving encryption (FPE) and explained the importance of tweakable block ciphers. Tweaks are randomized, public values that provide more randomness to block ciphers (kind of like salts in hash functions). Alexandra Boldyreva taught us about some inherent weaknesses of deterministic (i.e., equality-preserving) and order-preserving encryption. Hugo Krawczyk presented the details of ESPADA/OXT. Ben Fisch showed us how a tree of Bloom filters can be used to search on encrypted data (Blind Seer).
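
To make the tweak idea concrete, here is a toy Python sketch in the spirit of the XEX construction (it omits the finite-field multiplication that real XEX/XTS uses, the helper names are our own, and it requires the pyca/cryptography package); it is illustrative only, not a secure or standardized tweakable mode.

```python
import os
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

def _aes_ecb(key: bytes, block: bytes) -> bytes:
    # One-block AES as the underlying (untweaked) block cipher.
    enc = Cipher(algorithms.AES(key), modes.ECB()).encryptor()
    return enc.update(block) + enc.finalize()

def _xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def tweakable_encrypt(key: bytes, tweak: bytes, block: bytes) -> bytes:
    """E_K(T, X) = E_K(X xor D) xor D with D = E_K(T): the same key yields
    a different permutation for every public 16-byte tweak T."""
    delta = _aes_ecb(key, tweak)
    return _xor(_aes_ecb(key, _xor(block, delta)), delta)

key, tweak = os.urandom(16), os.urandom(16)
ct = tweakable_encrypt(key, tweak, b"sixteen byte blk")
```

The point is that changing the public tweak changes the ciphertext even under a fixed key, which is what makes tweaks useful, e.g., for per-record encryption in a database.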

Matthias's favourite talk was given by Benny Pinkas on Thursday. Its title was "Searchable Encryption using ORAM", and the argument for choosing that approach over the others presented before was that, for better security guarantees, it is desirable that the sequence of memory accesses to encrypted data reveals no more information than the running time. Oblivious RAM (ORAM) can be informally defined as follows:

A program $P$ is called "memory oblivious" if its memory access pattern $AP(x)$ (a sequence of $\verb!read!(a)$ and $\verb!write!(a,v)$ operations) is independent of the input $x=x(i,a,v)$ (a function of an instruction, an address and a value).
More formally, an oblivious memory access pattern of $P$ shall satisfy:
$$
\forall x\neq y: \left\vert x\right\vert =\left\vert y\right\vert \Longrightarrow AP(x)\approx AP(y).
$$
The "ad-hoc solution" requires $\mathcal{O}(nm)$ time and $\mathcal{O}(m)$ memory upon storing $n$ encrypted pairs $(a, v)$ of equal size $m$ in memory using a randomized encryption scheme and going through every memory cell.
If the addresses don't match, re-encrypt and overwrite data cell, else, perform the instruction, encrypt and write results.
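
A minimal Python sketch of this linear-scan solution, with `rand_encrypt`/`rand_decrypt` as hypothetical stand-ins for a real randomized encryption scheme:

```python
import os

def rand_encrypt(value):    # stand-in for randomized encryption:
    nonce = os.urandom(12)  # a fresh nonce makes re-encryptions unlinkable
    return (nonce, value)   # (a real scheme would actually encrypt `value`)

def rand_decrypt(ct):
    return ct[1]

def oblivious_access(memory, op, addr, new_value=None):
    """Perform read(addr) or write(addr, v) by scanning every cell, so the
    physical access pattern is identical for every logical address."""
    result = None
    for a in range(len(memory)):
        v = rand_decrypt(memory[a])
        if a == addr:
            result = v
            if op == "write":
                v = new_value
        memory[a] = rand_encrypt(v)  # re-encrypt every cell, matching or not
    return result
```

The server sees $n$ reads and $n$ fresh-looking writes per access, no matter which cell was targeted, which is exactly the obliviousness property at the cost of linear work.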

In this one-hour talk Pinkas then focused on the simplest tree-based ORAM scheme, but also gave possible ways to reduce the overhead, e.g. by applying ORAM schemes recursively.
The basic ORAM scheme has the server store a full binary tree with $\log n$ levels and $n$ leaves, where each node holds a bucket of up to $\log n$ data items.
The client randomly maps items to leaves and keeps a position map of $n$ entries of $\log n$ bits each.
Accessing an item requires traversing the path from root to leaf given by the position map. In each bucket along the path, remove the data item if it is found, and assign it a new random leaf. After updating the client's position map, write the updated item back to the root node.
To prevent overflows, perform the following oblivious eviction at each level:
choose two nodes at random and pop an item from each (non-empty) bucket, move that item down to the next node on its path, and perform a dummy write to the node's other child to hide which one was updated.
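
The following toy Python sketch captures the access-and-evict logic just described; it omits encryption, dummy writes and bucket-size bounds, and assumes the number of leaves is a power of two, so it illustrates the bookkeeping only, not a secure implementation.

```python
import random

class TreeORAM:
    """Toy version of the basic tree ORAM: buckets are plain lists and
    n (the number of leaves) must be a power of two."""

    def __init__(self, n: int):
        self.n = n
        self.tree = [[] for _ in range(2 * n - 1)]  # array-backed full binary tree
        self.pos = {}                               # client position map: item id -> leaf

    def _path(self, leaf: int):
        """Node indices from the root down to leaf number `leaf` (0-based)."""
        node = leaf + self.n - 1                    # array index of that leaf
        path = [node]
        while node > 0:
            node = (node - 1) // 2                  # parent in the array layout
            path.append(node)
        return path[::-1]

    def access(self, item_id, value=None):
        """Read item_id (value=None) or write `value` to it; one path scan."""
        leaf = self.pos.get(item_id, random.randrange(self.n))
        found = None
        for node in self._path(leaf):               # scan every bucket on the path
            for entry in list(self.tree[node]):
                if entry[0] == item_id:
                    found = entry[1]
                    self.tree[node].remove(entry)   # remove the item where found
        self.pos[item_id] = random.randrange(self.n)  # assign a fresh random leaf
        self.tree[0].append((item_id, value if value is not None else found))
        self._evict()
        return found

    def _evict(self):
        """Per level, pop an item from up to two random buckets and push it
        one node further down its root-to-leaf path (dummy writes omitted)."""
        depth = self.n.bit_length() - 1             # number of internal levels
        for d in range(depth):
            level = range(2**d - 1, 2**(d + 1) - 1)
            for node in random.sample(level, min(2, 2**d)):
                if self.tree[node]:
                    item_id, val = self.tree[node].pop(0)
                    path = self._path(self.pos[item_id])
                    child = path[path.index(node) + 1]  # next node toward its leaf
                    self.tree[child].append((item_id, val))
```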

To summarize: The server storage holds $\mathcal{O}(n \log n)$ data items, the client stores $n$ indices requiring $\mathcal{O}(n \log n)$ bits and each access costs $\mathcal{O}(\log^2 n)$ $\verb!read!$ and $\verb!write!$ operations.

Finally he discussed security limitations and efficiency problems, and presented encrypted search using ORAM:
A straightforward solution, given $n$ data items and $k$ keywords assigned to some of them, is to store $\leq nk$ tuples of the form ($w$, $\langle$data item associated with $w\rangle$), which in general requires storing each item multiple times.
Using two ORAMs (the first to store, for each keyword, the indices of the documents associated with it, i.e. an "inverted index", and the second to store the data items themselves) solves the problem of performing encrypted search more elegantly: the server can neither identify which item is being read nor tell whether a specific data item is accessed more often than others.
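
Reusing the toy TreeORAM above, the two-ORAM search could be sketched as follows (the identifiers and contents are made up, and a real scheme would also pad the result list to hide the number of matches):

```python
def search(index_oram: TreeORAM, data_oram: TreeORAM, keyword_id: int):
    """Two-ORAM keyword search: the first ORAM holds the inverted index
    (keyword -> list of document ids), the second holds the documents."""
    doc_ids = index_oram.access(keyword_id) or []   # 1st ORAM lookup
    return [data_oram.access(d) for d in doc_ids]   # 2nd ORAM, one access per hit

# Illustrative setup:
index_oram, data_oram = TreeORAM(8), TreeORAM(8)
data_oram.access(3, value="encrypted report")      # store document 3
index_oram.access(7, value=[3])                    # keyword 7 occurs in document 3
print(search(index_oram, data_oram, 7))            # -> ['encrypted report']
```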
Sadly, Benny Pinkas also pointed to a result from Naveed's 2015 eprint report (often, sending the entire database to the client is more efficient than doing searchable encryption with ORAM), so, as with the first part of the Winter School, there is still room for improvement!

We hope these two posts gave you a flavour of how amazing this event was. If not, or if you want to dig deeper into the talks, it's your lucky day: they were all recorded! (We will post a comment when they are online.)

Ecrypt-Net fellows in Caesarea


A sincere "Toda!" to the organizers, Yehuda Lindell, Benny Pinkas, and Yonit Homburger, for such a pleasant school!