Infrastructure software, such as operating systems and middleware, is large, diverse, and fast evolving, to meet changing needs in terms of hardware support, performance, and security. In practice, meeting these needs can involve repetitive code changes and can introduce bugs, either due to inattention or due to misunderstanding of evolving interfaces. Some potentially valuable changes may even be considered too expensive or too risky to implement. To reduce the cost and risk of making pervasive code changes, we have developed the Coccinelle program matching and transformation tool for C code. Coccinelle makes it possible to automate pervasive source code evolutions and common bug-finding tasks, via a patch-like notation that is accessible to C code developers. Coccinelle is widely used in the Linux kernel developer community and in the development of other open-source projects.
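To give a flavor of the patch-like notation, the following is a minimal sketch of a well-known kind of semantic patch, written in Coccinelle's SmPL language. It replaces an allocation followed by zeroing with a single zeroing allocation, for any expression and arguments, across an entire code base:

```
@@
expression x, size, flags;
@@
// Replace a kmalloc + memset pair with the equivalent kzalloc call
- x = kmalloc(size, flags);
- memset(x, 0, size);
+ x = kzalloc(size, flags);
```

The `@@ ... @@` header declares metavariables, and the `-`/`+` lines read like an ordinary patch, which is what makes the notation accessible to C developers.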
This interactive tutorial will give an overview of Coccinelle. A number of real examples from the Linux kernel will be presented.
Julia Lawall is a Senior Researcher at Inria Paris-Rocquencourt. Her research interests are in the area of improving the quality of infrastructure software, using a variety of approaches including program analysis, program transformation, and the design of domain-specific languages. She is the main developer of the tool Coccinelle for program matching and transformation in C code. Over 1200 patches based on her research have been accepted into the Linux kernel.
This tutorial addresses the challenges for middleware to process events in large scale environments such as the Internet of Things (IoT). Traditional event-based middleware follows an interaction model based on three decoupling dimensions: space, time, and synchronization. However, event producers and consumers are tightly coupled by event semantics: types, attributes, and values. This limits scalability in heterogeneous environments due to difficulties in establishing semantic agreements. The tutorial will motivate and explore this problem within the context of a Smart City IoT deployment.
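The semantic-coupling problem can be made concrete with a minimal, self-contained sketch of a content-based publish/subscribe broker (illustrative only; not any specific middleware). Producer and consumer are decoupled in space, time, and synchronization, yet the subscription only matches if both sides agree on the attribute name `temperature`:

```python
# Minimal content-based pub/sub sketch (hypothetical; for illustration only).
class Broker:
    def __init__(self):
        self.subscriptions = []  # list of (predicate, callback) pairs

    def subscribe(self, predicate, callback):
        self.subscriptions.append((predicate, callback))

    def publish(self, event):
        # Deliver the event to every subscriber whose predicate matches.
        for predicate, callback in self.subscriptions:
            if predicate(event):
                callback(event)

broker = Broker()
received = []
# Consumer subscribes using the attribute name "temperature".
broker.subscribe(lambda e: e.get("temperature", 0) > 30, received.append)

broker.publish({"temperature": 35})  # matches: shared vocabulary
broker.publish({"temp": 40})         # silently dropped: semantic mismatch

print(len(received))  # -> 1
```

The second event is lost not because of any failure in the decoupling dimensions, but because producer and consumer never established a semantic agreement on attribute names, which is exactly the coupling this tutorial examines.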
The tutorial grounds the discussion in the fundamental principles of information exchange between systems, reviews relevant computational paradigms, and discusses different approaches to the problem. It examines the advantages and disadvantages of each approach and concludes with a discussion of how to build working software for heterogeneous IoT environments.
Souleiman Hasan is a PhD researcher at the Insight Centre for Data Analytics at NUI Galway. His research tackles the problem of semantic coupling in event-based systems and its effect on scalability in large-scale heterogeneous environments such as the Internet of Things. He investigates the problem in the smart cities, power management, and water management domains and has worked on many research projects, including DERI Energy, SENSE, and Waternomics. Souleiman completed a Bachelor's degree in Information Technology Engineering, majoring in Software Engineering and Information Systems, at Damascus University, Syria. He has been a teaching assistant for several modules and coached the Damascus University team for the ACM Arab and North Africa Collegiate Programming Contest in 2007, 2008, and 2009.
Edward Curry is a Research Scientist at the Insight Centre for Data Analytics at NUI Galway. His research projects include studies of sustainable IT, energy intelligence, semantic information management, event-based systems, and collaborative data management. Edward has worked extensively with industry and government, advising on the adoption patterns, practicalities, and benefits of new technologies.
Edward has published over 90 scientific articles in journals, books, and international conferences. He has given invited talks at Berkeley, Stanford, and MIT. In 2010 he was a guest speaker at the MIT Sloan CIO Symposium before an audience of 600+ CIOs and senior IT executives. He currently participates in a project for the European Commission to define a research strategy for the Big Data economy within Europe. He holds a PhD from the National University of Ireland, Galway (www.nuigalway.ie) and serves as an Adjunct Lecturer within the University.
Data volumes are ever growing, across a broad application spectrum ranging from traditional database applications and scientific simulations to emerging applications such as Web 2.0 and online social networks. To cope with this added weight of Big Data, we have recently witnessed a paradigm shift in the way data is processed, through the MapReduce model. First promoted by Google, MapReduce has become, owing to the popularity of its open-source implementation Hadoop, the de facto programming paradigm for Big Data processing in large-scale data centers and clouds. Yet, to keep up with the ever-growing size of Big Data, Hadoop has recently been deployed in large-scale data centers equipped with thousands of energy-hungry servers. This results in a tremendous increase in the energy consumed to operate these large-scale data centers, leading not only to high electricity bills but also to high carbon emissions. The goal of this tutorial is to serve as a first step towards exploring the Hadoop MapReduce engine, providing a deep insight into the challenges of making Big Data greener, and discussing the main approaches developed in the literature.
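The MapReduce model itself can be sketched in a few lines. The toy, single-process word-count example below (illustrative only; Hadoop distributes these same phases across thousands of servers) shows the three phases the tutorial's discussion builds on: map, shuffle, and reduce:

```python
from collections import defaultdict
from itertools import chain

def map_phase(doc):
    # Map: emit a (word, 1) pair for every word in a document.
    return [(word, 1) for word in doc.split()]

def shuffle(pairs):
    # Shuffle: group all intermediate values by key.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Reduce: aggregate the values associated with each key.
    return {key: sum(values) for key, values in groups.items()}

docs = ["big data big clusters", "big energy"]
pairs = chain.from_iterable(map_phase(d) for d in docs)
counts = reduce_phase(shuffle(pairs))
print(counts["big"])  # -> 3
```

In Hadoop, the map and reduce tasks run in parallel on many machines, and it is precisely this scale-out that drives the energy costs the tutorial addresses.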
Anne-Cécile Orgerie is a tenured research scientist at CNRS. She works in the Myriads team at IRISA in Rennes, France. She received her Ph.D. degree in Computer Science from Ecole Normale Superieure de Lyon (France) in September 2011. Her Ph.D. thesis, entitled An energy-efficient reservation framework for large-scale distributed systems, was awarded the PhD thesis prize of the French chapter of ACM SIGOPS (ASF). Her research interests focus on energy efficiency, large-scale distributed systems, and telecommunication networks, from both practical and theoretical perspectives.
Shadi Ibrahim is a permanent Inria research scientist within the KerData research team. He obtained his Ph.D. in Computer Science from Huazhong University of Science and Technology in Wuhan, China, in 2011. His research interests are in cloud computing, big data management, data-intensive computing, virtualization technology, and file and storage systems. He has published several research papers in recognized big data and cloud computing journals and conferences, including several papers on optimizing and improving Hadoop MapReduce performance in the cloud and a book chapter on the MapReduce framework.
Resource management is critical for application domains where components share their execution environments but belong to different stakeholders, such as smart homes or cloud systems. Yet, current middleware and application containers often hide system-level details needed for dynamic resource management. In particular, they tend to hide resource usage by offering automatic management of resources such as CPU, memory, and I/O. In contrast, system-level containers, such as Linux Containers (LXC), allow fine-grained resource management. However, they lack the knowledge about the application's structure and requirements needed to provide fine-tuned resource management.
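The fine-grained control that system-level containers expose can be seen in a container configuration excerpt like the following sketch (LXC-style cgroup v1 key names; the values are illustrative, not a recommendation):

```
# Excerpt from an LXC container configuration (values are illustrative)
lxc.cgroup.cpu.shares = 512                # relative CPU weight vs. other containers
lxc.cgroup.memory.limit_in_bytes = 256M    # hard cap on the container's memory
lxc.cgroup.blkio.weight = 200              # relative block-I/O weight
```

Keys like these control raw system resources but say nothing about which application component needs them or why, which is the gap the middleware presented in this tutorial aims to fill.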
In this tutorial, we will present Squirrel: a new middleware that combines the benefits of component-based software engineering, for designing flexible and modular applications, with those of system-level containers, for managing resources. Squirrel follows an approach in which developers specify contracts on components and connections to describe the expected behavior of their application regarding resource consumption. These high-level contracts are then used to automatically configure the system-level containers that will host the running application. At the end of this tutorial, attendees will be able to design applications and contracts using Squirrel and run their applications inside system-level containers to enforce correct behavior regarding resource consumption.
Johann Bourcier is an Associate Professor at the University of Rennes 1, where he is a member of the Triskell INRIA research team. He received his Ph.D. degree in 2008 from the University of Grenoble 1. He is a former member of the LIG Adele research group (Grenoble) and later of the Distributed Software Engineering Section in the Department of Computing at Imperial College London. His research interests include the use of software engineering to simplify the development of highly dynamic pervasive applications. Johann Bourcier has co-authored 18 international peer-reviewed conference papers, journal articles, and book chapters in venues such as TAAS, GECCO, SEAMS, WICSA, SCC, …
Inti Gonzalez-Herrera is a Ph.D. student at the University of Rennes 1, where he is a member of the DiverSE INRIA research team. He received his M.Sc. degree in 2010 from the University of Las Villas, Cuba. His Ph.D. work focuses on providing new mechanisms to guarantee resource reservation in Java-based pervasive middleware. In a broad sense, his research interests include the use of software engineering and system programming to guarantee non-functional properties. Inti Gonzalez-Herrera has co-authored 5 international peer-reviewed conference and workshop papers in the fields of Computer Science and Applied Computing.