Christian Platzer
Vienna University of Technology
Publications
Featured research published by Christian Platzer.
International Conference on Web Services | 2006
Florian Rosenberg; Christian Platzer; Schahram Dustdar
Web services gain momentum for developing flexible service-oriented architectures. Quality of service (QoS) issues are not part of the Web service standard stack, although non-functional attributes like performance, dependability, or cost and payment play an important role for service discovery, selection, and composition. A lot of research is dedicated to different QoS models, while omitting a way to specify how QoS parameters (especially the performance-related aspects) are assessed, evaluated, and constantly monitored. Our contribution in this paper comprises: a) an evaluation approach for QoS attributes of Web services that works in a completely service- and provider-independent manner, and b) a method to analyze Web service interactions with our evaluation tool and extract important QoS information without any knowledge about the service implementation. Furthermore, our implementation allows assessing performance-specific values (such as latency or service processing time) that usually require access to the server which hosts the service. The result of the evaluation process can be used to enrich existing Web service descriptions with a set of up-to-date QoS attributes, thereby making it a valuable instrument for Web service selection.
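A minimal sketch of the measurement idea, assuming a hypothetical SOAP endpoint: network latency is approximated by a lightweight HEAD request and service processing time as the full round-trip time minus that latency estimate. This illustrates provider-independent probing in general and is not the evaluation tool from the paper.

```python
# Hypothetical endpoint and a rough latency/processing-time split, for
# illustration only; not the paper's evaluation tool.
import time
import requests

ENDPOINT = "http://example.org/ws/echo"  # placeholder service URL
SOAP_BODY = (
    '<?xml version="1.0"?>'
    '<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">'
    "<soap:Body><echo>ping</echo></soap:Body></soap:Envelope>"
)

def probe_latency(url, samples=5):
    """Approximate network latency with cheap HEAD requests (no service logic runs)."""
    times = []
    for _ in range(samples):
        start = time.perf_counter()
        requests.head(url, timeout=10)
        times.append(time.perf_counter() - start)
    return min(times)  # best case approximates pure network delay

def probe_round_trip(url, body):
    """Measure the full round-trip time of an actual service invocation."""
    start = time.perf_counter()
    requests.post(url, data=body, headers={"Content-Type": "text/xml"}, timeout=30)
    return time.perf_counter() - start

if __name__ == "__main__":
    latency = probe_latency(ENDPOINT)
    round_trip = probe_round_trip(ENDPOINT, SOAP_BODY)
    processing = max(round_trip - latency, 0.0)
    print(f"latency ~ {latency * 1000:.1f} ms, processing ~ {processing * 1000:.1f} ms")
```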
2014 Third International Workshop on Building Analysis Datasets and Gathering Experience Returns for Security (BADGERS) | 2014
Martina Lindorfer; Matthias Neugschwandtner; Lukas Weichselbaum; Yanick Fratantonio; Victor van der Veen; Christian Platzer
Android is the most popular smartphone operating system with a market share of 80%, but as a consequence, also the platform most targeted by malware. To deal with the increasing number of malicious Android apps in the wild, malware analysts typically rely on analysis tools to extract characteristic information about an app in an automated fashion. While the importance of such tools has been addressed by the research community, the resulting prototypes remain limited in terms of analysis capabilities and availability. In this paper we present ANDRUBIS, a fully automated, publicly available and comprehensive analysis system for Android apps. ANDRUBIS combines static analysis with dynamic analysis on both Dalvik VM and system level, as well as several stimulation techniques to increase code coverage. With ANDRUBIS, we collected a dataset of over 1,000,000 Android apps, including 40% malicious apps. This dataset allows us to discuss trends in malware behavior observed from apps dating back as far as 2010, as well as to present insights gained from operating ANDRUBIS as a publicly available service for the past two years.
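As a rough illustration of combining the two analysis views, the hypothetical sketch below merges static output (e.g., requested permissions) with dynamic observations (e.g., API calls seen at runtime) into a single report. The extractors themselves (APK parsing, instrumented emulator runs) are assumed to exist and to write JSON; none of the names correspond to the real ANDRUBIS interfaces.

```python
# Merge static and dynamic analysis results into one report. Fields and file
# names are illustrative, not the actual ANDRUBIS output format.
import json
from pathlib import Path

def load_json(path):
    p = Path(path)
    return json.loads(p.read_text()) if p.exists() else {}

def build_report(static_path, dynamic_path):
    static = load_json(static_path)    # e.g. {"permissions": [...], "receivers": [...]}
    dynamic = load_json(dynamic_path)  # e.g. {"api_calls": [...], "used_permissions": [...]}
    return {
        "requested_permissions": static.get("permissions", []),
        "observed_api_calls": dynamic.get("api_calls", []),
        # permissions that were requested but never exercised at runtime
        "unused_permissions": sorted(
            set(static.get("permissions", [])) - set(dynamic.get("used_permissions", []))
        ),
    }

if __name__ == "__main__":
    print(json.dumps(build_report("static.json", "dynamic.json"), indent=2))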
Recent Advances in Intrusion Detection | 2010
Marco Balduzzi; Christian Platzer; Thorsten Holz; Engin Kirda; Davide Balzarotti; Christopher Kruegel
Recently, social networks such as Facebook have experienced a huge surge in popularity. The amount of personal information stored on these sites calls for appropriate security precautions to protect this data. In this paper, we describe how we are able to take advantage of a common weakness, namely the fact that an attacker can query popular social networks for registered e-mail addresses on a large scale. Starting with a list of about 10.4 million email addresses, we were able to automatically identify more than 1.2 million user profiles associated with these addresses. By automatically crawling and correlating these profiles, we collect detailed personal information about each user, which we use for automated profiling (i.e., to enrich the information available from each user). Having access to such information would allow an attacker to launch sophisticated, targeted attacks, or to improve the efficiency of spam campaigns. We have contacted the most popular providers, who acknowledged the threat and are currently implementing our proposed countermeasures. Facebook and XING, in particular, have recently fixed the problem.
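The correlation step alone can be sketched as follows, assuming the per-network e-mail lookups have already been performed by network-specific crawlers; the network names, sample data, and field names are invented for illustration.

```python
# Merge per-network lookup results for the same e-mail address; the lookups
# themselves are assumed to happen elsewhere, and the sample data is invented.
from collections import defaultdict

def correlate(lookups):
    """lookups: dict mapping network name -> {email: profile dict}."""
    merged = defaultdict(dict)
    for network, results in lookups.items():
        for email, profile in results.items():
            merged[email][network] = profile
    # keep only addresses registered on more than one network
    return {email: profiles for email, profiles in merged.items() if len(profiles) > 1}

if __name__ == "__main__":
    sample = {
        "network_a": {"alice@example.org": {"name": "Alice A.", "city": "Vienna"}},
        "network_b": {"alice@example.org": {"name": "Alice A.", "employer": "ACME"}},
    }
    print(correlate(sample))
```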
ACM Transactions on Internet Technology | 2009
Christian Platzer; Florian Rosenberg; Schahram Dustdar
Increasingly, application developers seek the ability to search for existing Web services within large Internet-based repositories. The goal is to retrieve services that match the users' requirements. With the growing number of services in the repositories and the challenges of quickly finding the right ones, the need for clustering related services becomes evident to enhance search engine results with a list of similar services for each hit. In this article, a statistical clustering approach is presented that enhances an existing distributed vector space search engine for Web services with the possibility of dynamically calculating clusters of similar services for each hit in the list found by the search engine. The focus lies on a highly efficient and scalable clustering implementation that can handle very large service repositories. The evaluation with a large service repository demonstrates the feasibility and performance of the approach.
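A minimal sketch of the underlying idea, clustering textual service descriptions in a vector space model; scikit-learn's TF-IDF vectorizer and k-means stand in for the paper's own, more scalable implementation, and the descriptions are made-up examples.

```python
# Cluster service descriptions in a vector space model; TF-IDF plus k-means
# is an off-the-shelf substitute for the paper's clustering implementation.
from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import TfidfVectorizer

descriptions = [
    "convert currency between euro and dollar",
    "currency exchange rate lookup service",
    "send sms text message to a mobile phone",
    "sms gateway for mobile messaging",
]

vectors = TfidfVectorizer(stop_words="english").fit_transform(descriptions)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(vectors)

for label, desc in zip(labels, descriptions):
    print(label, desc)  # services with the same label fall into the same cluster
```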
Proceedings of the 2009 ICSE Workshop on Software Engineering for Secure Systems | 2009
Peter Wurzinger; Christian Platzer; Christian Ludl; Engin Kirda; Christopher Kruegel
Due to the increasing amount of Web sites offering features to contribute rich content, and the frequent failure of Web developers to properly sanitize user input, cross-site scripting prevails as the most significant security threat to Web applications. Using cross-site scripting techniques, miscreants can hijack Web sessions and craft credible phishing sites. Previous work towards protecting against cross-site scripting attacks suffers from various drawbacks, such as practical infeasibility of deployment due to the need for client-side modifications, inability to reliably detect all injected scripts, and complex, error-prone parameterization. In this paper, we introduce SWAP (Secure Web Application Proxy), a server-side solution for detecting and preventing cross-site scripting attacks. SWAP comprises a reverse proxy that intercepts all HTML responses, as well as a modified Web browser which is utilized to detect script content. SWAP can be deployed transparently for the client, and requires only a simple automated transformation of the original Web application. Using SWAP, we were able to correctly detect exploits on several authentic vulnerabilities in popular Web applications.
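The core detection idea can be sketched as follows: any script content in a response that is not among the application's known, legitimate scripts is treated as injected. SWAP itself uses a modified browser to find script content; the naive HTML parse below is only an illustration, not a reimplementation.

```python
# Flag script content that is not in the set of known, legitimate scripts.
# A naive HTML parse replaces SWAP's browser-based script detection.
import hashlib
from html.parser import HTMLParser

class ScriptCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_script = False
        self.scripts = []

    def handle_starttag(self, tag, attrs):
        if tag == "script":
            self.in_script = True

    def handle_endtag(self, tag):
        if tag == "script":
            self.in_script = False

    def handle_data(self, data):
        if self.in_script and data.strip():
            self.scripts.append(data.strip())

def find_injected(html, legitimate_hashes):
    collector = ScriptCollector()
    collector.feed(html)
    return [s for s in collector.scripts
            if hashlib.sha256(s.encode()).hexdigest() not in legitimate_hashes]

if __name__ == "__main__":
    known = {hashlib.sha256(b"renderMenu();").hexdigest()}
    page = "<html><script>renderMenu();</script><script>stealCookies()</script></html>"
    print(find_injected(page, known))  # -> ['stealCookies()']
```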
Service-Oriented Software Engineering | 2007
Anton Michlmayr; Florian Rosenberg; Christian Platzer; Martin Treiber; Schahram Dustdar
Service-oriented computing (SOC) receives a lot of attention from academia and industry as a means to develop flexible and dynamic software solutions. In practice, however, service-oriented solutions are far from being as dynamic and adaptable as they claim to be. The initial idea of the SOA triangle to publish-find-bind-execute a service is often not implemented as envisioned due to a number of missing or wrongly used concepts. In our ongoing VReSCO project, we are developing a service-oriented infrastructure that aims at solving a number of grand challenges currently evident in the SOC community. In this paper we present our initial work on providing a reasonable basis that addresses the issues of dynamic binding and invocation by leveraging a flexible solution based on notifications.
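A minimal in-process sketch of notification-driven dynamic binding, assuming a toy registry and proxy; the class and method names are illustrative and do not reflect the VReSCO API.

```python
# Toy registry that publishes binding changes; a proxy rebinds on notification.
class Registry:
    def __init__(self):
        self.subscribers = []
        self.endpoints = {}

    def subscribe(self, callback):
        self.subscribers.append(callback)

    def publish(self, service, endpoint):
        self.endpoints[service] = endpoint
        for callback in self.subscribers:
            callback(service, endpoint)

class ServiceProxy:
    def __init__(self, registry, service):
        self.service = service
        self.endpoint = registry.endpoints.get(service)
        registry.subscribe(self._on_change)

    def _on_change(self, service, endpoint):
        if service == self.service:
            self.endpoint = endpoint  # rebind without touching client code

    def invoke(self, payload):
        return f"POST {payload!r} to {self.endpoint}"

registry = Registry()
registry.publish("WeatherService", "http://host-a/weather")
proxy = ServiceProxy(registry, "WeatherService")
print(proxy.invoke({"city": "Vienna"}))
registry.publish("WeatherService", "http://host-b/weather")  # binding change event
print(proxy.invoke({"city": "Vienna"}))
```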
IEEE Symposium on Security and Privacy | 2009
Stefan Mitterhofer; Christopher Kruegel; Engin Kirda; Christian Platzer
One of the greatest threats that massively multiplayer online games face today is a form of cheating called botting. The authors propose an automated approach that detects bots on the server side based on character activity and is completely transparent to end users.
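One plausible way to capture "character activity" is sketched below, under the assumption that bots follow highly repetitive movement routes: score how often fixed-length waypoint subsequences repeat and flag characters above a threshold. Both the threshold and the sample paths are illustrative, not taken from the paper.

```python
# Score repetitiveness of a character's movement trace and flag likely bots.
from collections import Counter

def repetition_score(waypoints, window=4):
    """Fraction of fixed-length movement subsequences that occur more than once."""
    segments = [tuple(waypoints[i:i + window]) for i in range(len(waypoints) - window + 1)]
    if not segments:
        return 0.0
    counts = Counter(segments)
    repeated = sum(c for c in counts.values() if c > 1)
    return repeated / len(segments)

def looks_like_bot(waypoints, threshold=0.8):
    return repetition_score(waypoints) >= threshold

# a farming bot cycling the same route vs. a wandering human player
bot_path = [(1, 1), (2, 1), (2, 2), (1, 2)] * 20
human_path = [(x, (x * 7) % 13) for x in range(80)]
print(looks_like_bot(bot_path), looks_like_bot(human_path))  # True False
```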
Computer Software and Applications Conference | 2015
Martina Lindorfer; Matthias Neugschwandtner; Christian Platzer
Android dominates the smartphone operating system market and consequently has attracted the attention of malware authors and researchers alike. Despite the considerable number of proposed malware analysis systems, comprehensive and practical malware analysis solutions are scarce and often short-lived. Systems relying on static analysis alone struggle with increasingly popular obfuscation and dynamic code loading techniques, while purely dynamic analysis systems are prone to analysis evasion. We present MARVIN, a system that combines static with dynamic analysis and which leverages machine learning techniques to assess the risk associated with unknown Android apps in the form of a malice score. MARVIN performs static and dynamic analysis, both off-device, to represent properties and behavioral aspects of an app through a rich and comprehensive feature set. In our evaluation on the largest Android malware classification data set to date, comprised of over 135,000 Android apps and 15,000 malware samples, MARVIN correctly classifies 98.24% of malicious apps with less than 0.04% false positives. We further estimate the necessary retraining interval to maintain the detection performance and demonstrate the long-term practicality of our approach.
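A toy sketch of producing a malice score from combined static and dynamic features with an off-the-shelf linear classifier; the feature names, training rows, and model choice are placeholders and not MARVIN's actual feature set or learning setup.

```python
# Produce a malice score (probability of being malicious) from a sparse
# feature dictionary; features and training data are invented placeholders.
from sklearn.feature_extraction import DictVectorizer
from sklearn.linear_model import LogisticRegression

train_features = [
    {"perm:SEND_SMS": 1, "api:sendTextMessage": 1},                           # malicious
    {"perm:INTERNET": 1, "net:ads.example.com": 1},                           # benign
    {"perm:SEND_SMS": 1, "perm:READ_CONTACTS": 1, "api:sendTextMessage": 1},  # malicious
    {"perm:INTERNET": 1},                                                     # benign
]
train_labels = [1, 0, 1, 0]

vectorizer = DictVectorizer()
X = vectorizer.fit_transform(train_features)
classifier = LogisticRegression().fit(X, train_labels)

unknown_app = {"perm:SEND_SMS": 1, "api:sendTextMessage": 1, "perm:INTERNET": 1}
malice_score = classifier.predict_proba(vectorizer.transform([unknown_app]))[0, 1]
print(f"malice score: {malice_score:.2f}")
```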
Journal of Computer Virology and Hacking Techniques | 2011
Manuel Egele; Clemens Kolbitsch; Christian Platzer
Web spam denotes the manipulation of web pages with the sole intent to raise their position in search engine rankings. Since a better position in the rankings directly and positively affects the number of visits to a site, attackers use different techniques to boost their pages to higher ranks. In the best case, web spam pages are a nuisance that provide undeserved advertisement revenues to the page owners. In the worst case, these pages pose a threat to Internet users by hosting malicious content and launching drive-by attacks against unsuspecting victims. When successful, these drive-by attacks then install malware on the victims’ machines. In this paper, we introduce an approach to detect web spam pages in the list of results that are returned by a search engine. In a first step, we determine the importance of different page features to the ranking in search engine results. Based on this information, we develop a classification technique that uses important features to successfully distinguish spam sites from legitimate entries. By removing spam sites from the results, more slots are available to links that point to pages with useful content. Additionally, and more importantly, the threat posed by malicious web sites can be mitigated, reducing the risk for users to get infected by malicious code that spreads via drive-by attacks.
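A minimal sketch of the classification step, assuming a handful of invented per-page features and training rows; the paper's actual feature selection and classifier are not reproduced here.

```python
# Classify search results as spam or legitimate from per-page features and
# keep only the non-spam entries; features and training rows are invented.
from sklearn.tree import DecisionTreeClassifier

# per page: [keyword density, outgoing links, incoming-link score]
X_train = [
    [0.40, 350, 0.1],  # spam: keyword stuffing, link farm
    [0.35, 500, 0.2],  # spam
    [0.05, 30, 0.7],   # legitimate
    [0.08, 25, 0.6],   # legitimate
]
y_train = [1, 1, 0, 0]

classifier = DecisionTreeClassifier(random_state=0).fit(X_train, y_train)

results = [[0.42, 410, 0.15], [0.06, 20, 0.8]]
kept = [page for page, label in zip(results, classifier.predict(results)) if label == 0]
print("non-spam results kept:", kept)
```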
International Conference on Detection of Intrusions and Malware, and Vulnerability Assessment | 2010
Hanno Fallmann; Gilbert Wondracek; Christian Platzer
Cyber-criminals around the world are using Internet-based communication channels to establish trade relationships and complete fraudulent transactions. Furthermore, they control and operate publicly accessible information channels that serve as marketplaces for the underground economy. In this work, we present a novel system for automatically monitoring these channels and their participants. Our approach is focused on creating a stealthy system, which allows it to stay largely undetected by both marketplace operators and participants. We implemented a prototype that is capable of monitoring IRC (Internet Relay Chat) and web forum marketplaces, and successfully performed an experimental evaluation over a period of 11 months. We report our findings about the captured underground information channels and their characteristics.
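A bare-bones sketch of passively logging one IRC channel, in the spirit of the marketplace monitoring described above; server, channel, and nickname are placeholders, and the real system adds stealth measures, resilience, and web forum support.

```python
# Passively log PRIVMSG traffic in one IRC channel; connection details are
# placeholders and no stealth or reconnection logic is included.
import socket

SERVER, PORT = "irc.example.net", 6667
CHANNEL, NICK = "#market", "lurker123"

def monitor():
    sock = socket.create_connection((SERVER, PORT))
    sock.sendall(f"NICK {NICK}\r\nUSER {NICK} 0 * :{NICK}\r\n".encode())
    joined = False
    buffer = b""
    while True:
        buffer += sock.recv(4096)
        while b"\r\n" in buffer:
            line, buffer = buffer.split(b"\r\n", 1)
            text = line.decode(errors="replace")
            if text.startswith("PING"):
                sock.sendall(("PONG" + text[4:] + "\r\n").encode())  # keep-alive
            elif " 001 " in text and not joined:  # registration complete
                sock.sendall(f"JOIN {CHANNEL}\r\n".encode())
                joined = True
            elif "PRIVMSG" in text:
                print(text)  # record channel traffic for later analysis

if __name__ == "__main__":
    monitor()
```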