Previous years


2015

GIESSMANN Andrea - Designing business models of cloud platforms

Cloud computing and its three facets (Software as a Service (SaaS), Platform as a Service (PaaS), and Infrastructure as a Service (IaaS)) are terms that denote new developments in the software industry. In particular, PaaS solutions, also referred to as cloud platforms, are changing the way software is being produced, distributed, consumed, and priced. Software vendors have started considering cloud platforms as a strategic option but are struggling to redefine their offerings to embrace PaaS. In contrast to SaaS and IaaS, PaaS allows for value co-creation with partners who develop complementary components and applications. It thus requires multisided business models that bring together two or more distinct customer segments. Understanding how to design PaaS business models to establish a flourishing ecosystem is crucial for software vendors. This doctoral thesis aims to address this issue in three interrelated research parts. First, based on case study research, the thesis provides a deeper understanding of current PaaS business models and their evolution. Second, it analyses and simulates consumers’ preferences regarding PaaS business models, using a conjoint approach to find out what determines the choice of cloud platforms. Finally, building on the previous research outcomes, the third part introduces a design theory for the emerging class of PaaS business models, grounded in an extensive action design research study with a large European software vendor. Understanding PaaS business models from a market as well as a consumer perspective will, together with the design theory, inform and guide decision makers in their business model innovation plans. It also closes gaps in the research related to PaaS business model design and, more generally, to platform business models.

BUESSER Pierre - Evolutionary games on weighted and spatial networks

This work mainly investigates cooperation dilemmas in society when the population is structured as a complex network or when agents lie in space and can migrate. Cyclic games are also investigated in the framework of migration. Cooperation is modeled by two-player games in which each player chooses between two available strategies, cooperation and defection. Among other games, we study the prisoner’s dilemma. In that game mutual cooperation is the best collective choice, but the structure of the game leads selfish agents to both defect: the temptation to defect is strong, and the so-called sucker’s payoff earned by a cooperator against a defector is very low. Using this game and others, we first study the evolution of cooperation on weighted networks and on spatial networks. Then we study the evolution of cooperation when the players can migrate in space in order to improve their payoffs. We find that when weights are attributed according to certain degree-weight correlations on a social network, cooperation can be strongly enhanced. In a second part we show that particular hierarchical topologies embedded in space lead to particularly high levels of cooperation. In a third part, exploring migration, we find that when agents imitate their neighbors randomly while migrating opportunistically, cooperation spreads in the population.
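
To make the payoff structure concrete, here is a minimal sketch of the dilemma described above (the numeric payoff values are illustrative, not the parameter ranges studied in the thesis); it verifies that defection is the best response to either strategy:

```python
# Illustrative prisoner's dilemma payoffs (hypothetical values).
# R: reward for mutual cooperation, T: temptation to defect,
# S: sucker's payoff, P: punishment for mutual defection.
# A prisoner's dilemma requires T > R > P > S.
T, R, P, S = 1.5, 1.0, 0.0, -0.5

# Row player's payoff for each (own strategy, opponent strategy) pair.
payoff = {
    ("C", "C"): R, ("C", "D"): S,
    ("D", "C"): T, ("D", "D"): P,
}

for opponent in ("C", "D"):
    best = max(("C", "D"), key=lambda own: payoff[(own, opponent)])
    print(f"Best response to {opponent}: {best}")  # "D" in both cases
```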

ATAEE Shabnam - An Architecture for Adaptive Replication-Based Multimedia Streaming in P2P Networks

A large portion of Internet traffic today is due to media streaming, and this trend is still growing, as evidenced by the success of services like Skype, Spotify and Netflix. Media streaming consists of sending video or audio content in a continuous flow of data over the Internet and playing this content as it arrives. Since computing resources such as bandwidth, memory and processing are limited, delivering multimedia content in a scalable manner is a key challenge. This PhD thesis addresses the issue of scalable media streaming in large-scale networks.

The client-server model is a common approach to streaming, where media consumers (clients) establish a connection with a media server somewhere on the Internet. In this model, when the number of consumers increases, more dedicated servers must be added to the system, which tends to be expensive. The peer-to-peer (P2P) approach offers an alternative and naturally scalable solution, where each peer can act as both client and server. Most proposed P2P streaming solutions focus on routing to achieve scalability. However, routing alone is limited when resources are insufficient; this is where replication can help.

In this thesis, we propose a family of replication-based streaming protocols. Our first two protocols, named ScaleStream and ReStream, adaptively replicate media content on different peers, based on the demand in the neighborhood of each peer, in order to increase the number of consumers that can be served in parallel. These solutions are adaptive in the sense that they take resource constraints, such as the bandwidth capacity of peers, into account when deciding whether to add or remove replicas. Our last two protocols, named EagleMacaw and TurboStream, are also replication-based but additionally optimize media routing to improve efficiency and reliability and to reduce latency.

ROSSELET Ulysse - Impacts des technologies de l'information sur les modes de coordination à travers quatre études en systèmes d'information

The evolution of the economic environment, of value chains, and of organizations' business models increases the importance of coordination, which can be defined as the management of interdependencies between tasks performed by different actors working towards a common goal. Organizations deploy numerous means to manage these interdependencies. In this respect, coordination activities benefit massively from the support of information and communication technologies (ICT), which are now disseminated, integrated, and connected in multiple forms in both private and professional environments. In this work, we investigated the following research question: how do the ubiquity and interconnectivity of ICT change modes of coordination?

Through four information systems studies conducted with a design science methodology, we addressed this question at two levels: that of strategic alignment between business and information systems, where coordination concerns the interdependencies between activities; and that of the execution of activities, where coordination concerns the interdependencies of individual interactions. At the strategic level, we observe that ubiquity and interconnectivity make it possible to transpose coordination mechanisms from one domain to another. By facilitating various forms of co-presence and visibility, they also increase proximity in asynchronous or remote coordination situations. At the activity level, ICT offer actors a very strong potential for participation and proximity. Such technologies enable actors to establish responsibilities, improve their common understanding, and plan the execution and integration of tasks.

The main contribution emerging from these four studies is that practitioners can use the ubiquity and interconnectivity of ICT to enable individuals to communicate and adjust their actions in order to define, reach, and redefine the objectives of their common work.

ANTONIONI Alberto - Evolutionary games in networked populations: models and experiments

Cooperation and coordination are desirable behaviors that are fundamental for the harmonious development of society. People need to rely on cooperation with other individuals in many aspects of everyday life, such as teamwork and economic exchange in anonymous markets. However, cooperation may easily fall prey to exploitation by selfish individuals who only care about short-term gain. For cooperation to evolve, specific conditions and mechanisms are required, such as kinship, direct and indirect reciprocity through repeated interactions, or external interventions such as punishment.

In this dissertation we investigate the effect of the network structure of the population on the evolution of cooperation and coordination. We consider several kinds of static and dynamic network topologies, such as Barabási-Albert networks, social network models, and spatial networks. We perform numerical simulations and laboratory experiments using the Prisoner's Dilemma and coordination games in order to contrast human behavior with theoretical results.

Thesis in joint-supervision with the Universidad Carlos III de Madrid

OCHOA Camila - Generation and interconnection capacity expansion in cross-border electricity markets: the need for policy coordination

Electricity is a strategic service in modern societies. Thus, it is extremely important for governments to be able to guarantee an affordable and reliable supply, which depends to a great extent on an adequate expansion of the generation and transmission capacities. Cross-border integration of electricity markets creates new challenges for the regulators, since the evolution of the market is now influenced by the characteristics and policies of neighbouring countries.

There is still no agreement on why and how regions should integrate their electricity markets. The aim of this thesis is to improve the understanding of integrated electricity markets and how their behaviour depends on the prevailing characteristics of the national markets and the policies implemented in each country.

We developed a simulation model to analyse under what circumstances integration is desirable. This model is used to study three cases of interconnection between two countries. Several policies regarding interconnection expansion and operation, combined with different generation capacity adequacy mechanisms, are evaluated.

The thesis is composed of three papers. In general, we conclude that electricity market integration can bring benefits if the right policies are implemented. However, a large interconnection capacity is only desirable if the countries exhibit significant complementarity and trust each other. The outcomes of policies aimed at guaranteeing security of supply at a national level can be quite counterintuitive due to the interactions between neighbouring countries and their effects on interconnection and generation investments.

Thus, it is important for regulators to understand these interactions and coordinate their decisions in order to take advantage of the interconnection without putting security of supply at risk. But it must be taken into account that even when integration brings benefits to the region, some market participants lose and might try to hinder the integration process.

PEQUIGNOT Yann - Better-quasi-order: Ideals and Spaces

This thesis deals with combinatorics, order theory and descriptive set theory.
The first contribution is to the theory of well-quasi-orders (wqo) and better-quasi-orders (bqo). The main result is the proof of a conjecture made by Maurice Pouzet in 1978 in his thèse d’état, which states that any wqo whose ideal completion remainder is bqo is actually bqo. Our proof relies on new results, with both a combinatorial and a topological flavour, concerning maps from a front into a compact metric space. The second contribution is of a more applied nature and deals with topological spaces. We define a quasi-order on the subsets of every second countable T0 topological space in a way that generalises the Wadge quasi-order on the Baire space, while extending its nice properties to virtually all these topological spaces.
The Wadge quasi-order of reducibility by continuous functions is wqo on the Borel subsets of the Baire space; this quasi-order is, however, far less satisfactory for other important topological spaces such as the real line, as Hertling, Ikegami and Schlicht notably observed. Some authors have therefore studied reducibility with respect to certain classes of discontinuous functions to remedy this situation. We propose instead to keep continuity but to weaken the notion of function to that of relation. Using the notion of admissible representation studied in Type-2 theory of effectivity, we define the quasi-order of reducibility by relatively continuous relations. We show that this quasi-order both refines the classical hierarchies of complexity and is wqo on the Borel subsets of virtually every second countable T0 space, including every (quasi-)Polish space.
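
For reference, the standard definition of a well-quasi-order used above (a textbook definition, not specific to this thesis) can be stated as follows; better-quasi-orders strengthen this condition via fronts, whose definition is omitted here:

```latex
% (Q, \le) is a well-quasi-order iff every infinite sequence in Q
% admits an increasing pair.
\[
(Q,\le)\ \text{is wqo}
\iff
\forall (q_n)_{n\in\mathbb{N}} \in Q^{\mathbb{N}},\
\exists\, i<j \ \text{such that}\ q_i \le q_j .
\]
```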

Thesis in joint-supervision with the Université Paris-Diderot (Paris 7)

METRAILLER Alexandre - Evolis : un cadre conceptuel pour l'étude de l'évolution des systèmes d'information

In this thesis, we study the evolution of information systems. We are particularly interested in the factors that trigger evolution, what they represent, and how they allow us to learn more about the life cycle of information systems.

To this end, we developed a conceptual framework for the study of evolutions that takes into account not only the factors triggering an evolution, but also the nature of the activities undertaken in order to evolve. We followed a design science approach to build this framework. Following this approach, we developed the framework iteratively, instantiating and then evaluating it in order to refine its design. This allowed us to make several contributions, both practical and theoretical.

The first theoretical contribution of this research is the identification of four main factors triggering evolutions. These factors are elements from domains that are generally studied separately; the framework brings them together in a single tool for the study of evolutions. Another theoretical contribution is the study of systems' life cycles according to these factors. Indeed, the repeated use of the framework to qualify evolutions highlights the main motivations for evolution at each stage of the life cycle. By comparing the evolutions of several systems, it becomes possible to bring out specific patterns of system evolution.

Regarding practical contributions, the main one concerns the steering of evolution. For an information system manager, applying the framework provides precise knowledge of the actual allocation of resources for an evolution, as well as of the system's position in its life cycle. The framework can therefore help managers with the planning and strategy of a system's evolution. The evolution patterns identified through the application of the framework are also a valuable aid in defining the steering strategy and the activities to undertake when planning evolutions.

Finally, the framework provided the foundations needed to build a dashboard for monitoring the life cycle and steering the evolution of information systems.

2014

TUNKELO Teemu Mauno - Three essays on major trends in a slow clockspeed industry: the case of industrial automation

Post-industrial societies depend on the efficiency and sustainability of their industrial production, which is controlled by specialized industrial automation computer systems. Industrial automation is dominated by global companies with proprietary solutions and relies on technologies largely replaced in other computer markets. Companies operating in this mature market constantly improve their operations and manage technology disruptions; the related decisions are based on a combination of facts, emotions and personal agendas. We study three trends in industrial automation: two research projects observe competitiveness improvements through outsourcing and process improvement, and one explores the management outlook concerning rapid technology developments in adjacent high-volume markets.

Globalization moves industrial facilities between continents and creates larger units. Responding to the changing environment requires companies to focus on core competences, process improvements and customer experience. A focus on core competences leads to outsourcing or insourcing of selected activities. We research captive outsourcing, a novel outsourcing model in which the outsourced unit, located in an emerging country, is an integral part of the company's operations rather than an external supplier. It is not merely a low-cost engineering pool but has responsibility for complete subsystems. Since employee commitment and a low attrition rate are key to the success of this model, the company focuses on employee satisfaction and develops a brand as a good local employer. Next, we research support process improvement from the customer perspective and implement a new support process based on an end-to-end lead-time measurement system for the reduction and faster resolution of customer issues.

New system-on-chip technologies are disrupting information and communications technology (ICT) markets. Shipment volumes of smartphones and tablets exceed those of all earlier computing technologies. As the last trend, we research management views on the future from the perspectives of customers, incumbents and newcomers. To benefit from the new technologies, a disruption management function is considered necessary, and a new quantitative model for disruption assessment is proposed.

SIMMS David John - A confluence of risks: control and compliance in the world of unstructured data, big data and the cloud

The emergence of powerful new technologies, the existence of large quantities of data, and increasing demands for the extraction of added value from these technologies and data have created a number of significant challenges for those charged with both corporate and information technology management. The possibilities are great, the expectations high, and the risks significant. Organisations seeking to employ cloud technologies and exploit the value of the data to which they have access, be this in the form of “Big Data” available from different external sources or data held within the organisation, in structured or unstructured formats, need to understand the risks involved in such activities. Data owners have responsibilities towards the subjects of the data and must also, frequently, demonstrate that they are in compliance with current standards, laws and regulations.

This thesis sets out to explore the nature of the technologies that organisations might utilise, identify the most pertinent constraints and risks, and propose a framework for the management of data from discovery to external hosting that will allow the most significant risks to be managed through the definition, implementation, and performance of appropriate internal control activities.

ZUMWALD Pierre - Apport de la systémique dans l’amélioration de la compréhension des systèmes de prise en compte des risques en entreprise

"Contribution of systemic science in the improvement of understanding of risk management system" offers a holistic view of enterprise wise risk management.

Risk management is often approached through linear methods that stress fixed positions and causal logical frameworks: to given events correspond given consequences and, accordingly, given risks. The interrelationships between risks are often overlooked, and risks are rarely analyzed in their dynamic and nonlinear components.

This work shows what systemic methods, including the study of complex systems, can bring to the knowledge, management and anticipation of business risks, on both the conceptual and the practical side. Starting from the definitions of systems and risks in various areas, as well as the methods used to manage risk, this work confronts these concepts with complex systems analysis and modeling approaches.

This work highlights the reductive effects of some business risk analysis methods, as well as the limitations of risk universes caused in particular by unsuitable definitions. As a result, this work also provides chief officers with a range of tools and approaches that allow them to better understand complexity and thus gain efficiency in their risk management practices. This results in a better fit between strategy and risk management; ultimately, the firm gains in risk management maturity.

BEKKOUCHE Amine - Cyberadministration : un tremplin pour les pays émergents

We are currently witnessing the spread of information and communication technologies (ICT) on a global scale. Yet this spread proceeds at different paces within each nation (and even among regions of a given country), which creates a "digital" divide in addition to the multiple inequalities already present. This computing and technological revolution engenders many changes in social relationships and permits numerous applications that are destined to simplify our lives.

Amine Bekkouche takes a closer look at the issue of e-government as an important consequence of ICTs, following the example of electronic commerce. First, he presents a synthesis of the main concepts in e-government as well as a panoramic view of the global situation in this domain.

Subsequently, he studies e-government from the standpoint of emerging countries, in particular through the illustration of a representative developing country. He then offers concrete solutions, which take the education sector as their starting point, to allow for a "computed digitalisation" of society that contributes to reducing the digital divide. Thereafter, he broadens these proposals to other domains and formulates recommendations that support their implementation. Finally, he concludes with perspectives that may constitute further research tracks and enable the elaboration of development projects, through the appropriation of ICT, in order to improve the condition of the administered and, more generally, that of the citizen.

DAOLIO Fabio - Local optima networks of hard combinatorial landscapes

Many everyday problems involve finding an optimal solution among a finite set of possibilities, termed the problem's search space. In practice, enumerating all the possibilities becomes infeasible beyond a given problem size, but approximate methods exist. In the most general case, these methods start with a candidate solution and gradually refine it through partial modifications until no improvement is possible. The variation operation, by connecting candidate solutions, induces a neighborhood structure on the search space, such that the search process can be described as a trajectory over this configuration space. Heuristic methods try to guide the search towards better solutions; their performance therefore depends on the structure of the space being searched.

In this thesis, we analyze such structure by looking at the graph having as nodes solutions that are locally optimal and that act as attractors to the search trajectory, and as edges the possible transitions between those local optima. This allows us to employ methods from the science of complex networks in order to characterize in a novel way the search space of hard combinatorial problems; we argue that such network characterization can advance our understanding of the structural and dynamical properties of these spaces.

We investigate several methodologies to build the network of local optima and we apply our approach to prototypical problems such as the Quadratic Assignment Problem, the NK model of rugged landscapes, and the Permutation Flow-shop Scheduling Problem. We show that some network metrics can differentiate problem classes, correlate with problem non-linearity, and help to predict problem hardness as measured from the performances of trajectory-based search heuristics.
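
As a minimal illustration of the construction described above, the following sketch (a toy random landscape over bitstrings with hypothetical parameters, standing in for the NK and permutation problems studied in the thesis) extracts the local optima network of a small instance:

```python
import random
from collections import defaultdict

random.seed(0)
n = 8  # bitstring length; 2**n solutions, small enough to enumerate

# Toy landscape: one random fitness value per solution.
fitness = {s: random.random() for s in range(2**n)}

def neighbors(s):
    # One-bit-flip neighborhood.
    return [s ^ (1 << k) for k in range(n)]

def hill_climb(s):
    # Best-improvement local search: returns the local optimum
    # whose basin of attraction contains s.
    while True:
        best = max(neighbors(s), key=fitness.__getitem__)
        if fitness[best] <= fitness[s]:
            return s
        s = best

# Map every solution to its attracting local optimum.
basin = {s: hill_climb(s) for s in range(2**n)}
optima = set(basin.values())

# Edges of the local optima network: a directed edge (i, j) is weighted
# by the number of one-flip transitions from the basin of i to that of j.
edges = defaultdict(int)
for s in range(2**n):
    for t in neighbors(s):
        if basin[s] != basin[t]:
            edges[(basin[s], basin[t])] += 1

print(f"{len(optima)} local optima, {len(edges)} directed edges")
```

Network metrics (degree distributions, clustering, weight disparity) can then be computed on this graph exactly as for any complex network.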

HAKI ESNAFKAREH Mohammad Kazem - Essays on the principles and adoption of enterprise-wide architecture

Enterprise-wide architecture has become a necessity for organizations to (re)align information technology (IT) to changing business requirements. Since a city planning metaphor inspired enterprise-wide architecture, this dissertation’s research axes can be outlined by similarities between cities and enterprises. Both are characterized as dynamic super-systems that need to address the evolving interest of various architecture stakeholders. Further, both should simultaneously adhere to a set of principles to guide the evolution of architecture towards the expected benefits. The extant literature on enterprise-wide architecture not only disregards architecture adoption’s complexities but also remains vague about how principles guide architecture evolution. To bridge this gap, this dissertation contains three interrelated research streams examining the principles and adoption of enterprise-wide architecture.

The first research stream investigates the organizational intricacies inherent in architecture adoption. It characterizes architecture adoption as an ongoing organizational adaptation process. By analyzing organizational response behaviors in this adaptation process, it also identifies four archetypes that represent very diverse architecture approaches. The second research stream ontologically clarifies the nature of architecture principles and outlines new avenues for theoretical contributions. This research stream also provides an empirically validated set of principles and proposes a research model illustrating how principles can be applied to generate the expected architecture benefits. The third research stream examines architecture adoption in multinational corporations (MNCs). MNCs are characterized by unique organizational features and constantly strive to balance global integration and local responsiveness. This research stream characterizes MNCs' architecture adoption as a continuous endeavor to keep architecture in step with stakeholders' beliefs about how to balance global integration and local responsiveness.

To conclude, this dissertation provides a thorough explanation of a long-term journey in which organizations learn over time to adopt an effective architecture approach. It also clarifies the role of principles to purposefully guide the aforementioned learning process. 

FRITSCHER Boris - Computer aided business model design

There is a lack of dedicated tools for business model design at a strategic level. Yet in today's economic world, the ability to quickly reinvent a company's business model is essential to staying competitive. This research focused on identifying the functionalities that are necessary in a computer-aided design (CAD) tool for the design of business models in a strategic context. Using a design science research methodology, a series of techniques and prototypes were designed and evaluated to offer solutions to the problem. The work is a collection of articles which can be grouped into three parts:

The first part establishes the context of how the Business Model Canvas (BMC) is used to design business models and explores the ways in which CAD can contribute to the design activity.

The second part extends this by proposing new techniques and tools which support the elicitation, evaluation (assessment) and evolution of business model designs with CAD. This includes features such as multi-color tagging to easily connect elements, rules to validate the coherence of business models, and features adapted to the business model proficiency level of the tool's users. A new way to describe and visualize multiple versions of a business model, thereby helping to address the business model as a dynamic object, was also researched.

The third part explores extensions to the Business Model Canvas, such as an intermediary model which supports IT alignment by connecting the business model and the enterprise architecture, and a business model pattern for privacy in a mobile environment that uses privacy as a key value proposition.

The prototyped techniques and the propositions for using CAD tools in business model design will allow commercial CAD developers to create tools that are better suited to the needs of practitioners.

LIU Zhan - Adaptive privacy management system design for context-aware mobile devices

While mobile technologies can provide great personalized services to mobile users, they also threaten users' privacy. This personalization-privacy paradox is particularly salient for context-aware mobile applications, where users' behaviors, movements and habits can be associated with their personal identity.

In this thesis, I studied privacy issues in the mobile context, focusing particularly on the design of an adaptive privacy management system for context-aware mobile devices, and explored the role of personalization and of control over users' personal data. This allowed me to make multiple contributions, both theoretical and practical. On the theoretical side, I propose and prototype an adaptive single sign-on solution that uses the user's context information to protect private information on smartphones. To validate this solution, I first showed that a user's context is a unique user identifier and that context-awareness technology can increase users' perceived ease of use of the system and the authentication security of service providers. I then followed a design science research paradigm and implemented this solution in a mobile application called "Privacy Manager". I evaluated its utility through several focus-group interviews; overall, the proposed solution fulfilled the expected functions, and users expressed their intention to use the application. To better understand the personalization-privacy paradox, I built on the theoretical foundations of privacy calculus and the technology acceptance model to conceptualize a theory of users' mobile privacy management. I also examined the role of personalization and control in my model and how these two elements interact with the privacy calculus and technology acceptance constructs. In the practical realm, this thesis contributes to the understanding of the trade-off between the benefits of personalized services and the privacy concerns they may cause. By pointing out new opportunities to rethink how users' context information can protect private data, it also suggests new elements for privacy-related business models.

EMAD Sabine - Leveraging sandbox immersive 3D virtual worlds to develop do-it-yourself teaching games: implementation to marketing case study teaching in Second Life

Games are powerful and engaging. On average, one billion people spend at least one hour a day playing computer and video games. This is even more true of the younger generations. Our students have become the "digital natives", the "gamers", the "virtual generation". Research shows that those who are most at risk of failure in the traditional classroom setting also spend more time than their counterparts using video games. They might thrive, given a different learning environment.

Educators have a responsibility to align their teaching style with the learning styles of these younger generations. However, many academics resist computer-assisted learning that has been "created elsewhere". This can be extrapolated to game-based teaching: even if educational games were more widely authored, their adoption would still be limited to the educators who feel a match between the authored games and their own beliefs and practices. Consequently, game-based teaching would be much more widespread if teachers could develop their own games, or at least customize them. Yet the development and customization of teaching games are complex and costly.

This research uses a design science methodology, leveraging gamification techniques, active and cooperative learning theories, and immersive sandbox 3D virtual worlds, to develop a method that allows management instructors to transform any off-the-shelf case study into an engaging, collaborative, gamified experience. The method is applied to marketing case studies and uses the sandbox virtual world of Second Life.

2013

LÄNGER Thomas - Information security and the enforcement of secrecy: the practical application of quantum key distribution

There is no doubt about the necessity of protecting digital communication: citizens entrust their most confidential and sensitive data to digital processing and communication, and so do governments, corporations, and armed forces. Digital communication networks are also an integral component of many critical infrastructures on which we seriously depend in our daily lives. Transportation services, financial services, energy grids, and food production and distribution networks are only a few examples of such infrastructures. Protecting digital communication means protecting its confidentiality and integrity by encrypting and authenticating its contents. But most digital communication is not secure today. Nevertheless, some of the most pressing problems could be solved with a more stringent use of current cryptographic technologies.

Quite surprisingly, a new cryptographic primitive emerges from the application of quantum mechanics to information and communication theory: Quantum Key Distribution (QKD). QKD is difficult to understand; it is complex, technically challenging, and costly. Yet it enables two parties to share a secret key for use in any subsequent cryptographic task, with unprecedented long-term security. It is disputed whether technically and economically feasible applications can be found.

Our vision is that, despite technical difficulty and inherent limitations, Quantum Key Distribution has great potential and fits well with other cryptographic primitives, enabling the development of highly secure new applications and services. In this thesis we take a structured approach to analyzing the practical applicability of QKD and present several use cases of different complexity for which it can be a technology of choice, either because of its unique forward-security features or because of its practicability.

GALOFARO Serge - Evaluation d'une méthode de modélisation et d'appréhension de la complexité
VESSAZ François - System-level support for mobile ad hoc communications: an algorithmic & practical approach

A mobile ad hoc network (MANET) is a decentralized and infrastructure-less network. This thesis aims to provide system-level support for developers of applications or protocols in such networks. To do this, we propose contributions in both the algorithmic and the practical realms. In the algorithmic realm, we contribute to the field by proposing different context-aware broadcast and multicast algorithms for MANETs, namely six-shot broadcast, six-shot multicast, PLAN-B, and a generic algorithmic approach to optimize the power consumption of existing algorithms. We compare each proposed algorithm to existing algorithms that are either probabilistic or context-aware, and we evaluate their performance through simulations. We demonstrate that in some cases context-aware information, such as location or signal strength, can improve efficiency. In the practical realm, we propose a testbed framework, namely ManetLab, to implement and deploy MANET-specific protocols and to evaluate their performance. This testbed framework aims to increase the accuracy of performance evaluations compared to simulations, while keeping the ease of use that simulators offer for reproducing a performance evaluation. By evaluating the performance of different probabilistic algorithms with ManetLab, we observe that simulations and testbeds should be used in a complementary way. In addition to these original contributions, we also provide two surveys about system-level support for ad hoc communications, establishing a state of the art: the first covers existing broadcast algorithms, and the second covers existing middleware solutions and the way they deal with privacy, especially location privacy.

CARROY Raphael - Fonctions de première classe de Baire

Thesis in joint-supervision with the Université Paris-Diderot

2012

MOHAN Kunal - Understanding the acceptance and usage of project management methodologies
DELGADO Carlos - Behavioural queueing: cellular automata and a laboratory experiment

Queueing is a fact of life that we witness daily. We have all experienced waiting in line for some reason, and we know it is an annoying situation. As the adage says, "time is money"; this is perhaps the best way of stating what queueing problems mean for customers. Human beings are not very tolerant, and they are even less so when having to wait in line for service. Banks, roads, post offices and restaurants are just some examples where people must wait for service.

Studies of queueing phenomena have typically addressed the optimisation of performance measures (e.g. average waiting time, queue length and server utilisation rates) and the analysis of equilibrium solutions. Although this work has been useful for improving the efficiency of many queueing systems and for designing new processes in social and physical systems, it has provided only a limited ability to explain the behaviour observed in many real queues; the individual behaviour of the agents involved in queueing systems and their decision-making processes have received little attention.

In this dissertation we depart from this traditional research by analysing how the agents involved in the system make decisions, instead of focusing on optimising performance measures or analysing an equilibrium solution. This dissertation builds on and extends the framework proposed by van Ackere and Larsen (2004) and van Ackere et al. (2010). We focus on studying behavioural aspects of queueing systems and incorporate this still underdeveloped framework into the operations management field.

BEN AYED Ghazi - Architecting User-Centric Privacy-as-a-Set-of-Services: Digital Identity Related Privacy Framework

Digitalization empowers the Internet by allowing several virtual representations of reality, including that of identity. We leave an increasingly digital footprint in cyberspace, and this situation puts our identity at high risk. Privacy is a right and a fundamental social value that could play a key role as a medium to secure digital identities. Identity functionality is increasingly delivered as sets of services rather than monolithic applications, so an identity layer in which identity and privacy management services are loosely coupled, publicly hosted and available to on-demand calls is a more realistic and acceptable prospect. Identity and privacy should be interoperable and distributed through the adoption of service-orientation and implementation based on open standards (technical interoperability).

The objective of this project is to provide a way to implement interoperable, user-centric, digital identity-related privacy that responds to the distributed nature of federated identity systems. It is recognized that technical initiatives, emerging standards and protocols are not enough to guarantee a resolution of the concerns surrounding the multi-faceted and complex issue of identity and privacy. For this reason, they should be apprehended within a global perspective, through an integrated and multidisciplinary approach. This approach dictates that privacy laws, policies, regulations and technologies are to be crafted together from the start, rather than attached to digital identity after the fact. Thus, we draw Digital Identity-Related Privacy (DigIdeRP) requirements from global, domestic and business-specific privacy policies. The requirements take the shape of business interoperability. We suggest a layered implementation framework (the DigIdeRP framework), in accordance with the model-driven architecture (MDA) approach, that helps organizations' security teams turn business interoperability into technical interoperability in the form of a set of services that can accommodate a service-oriented architecture (SOA): a Privacy-as-a-Set-of-Services (PaaSS) system.

The DigIdeRP framework will serve as a basis for vital understanding between business managers and technical managers on digital identity-related privacy initiatives. It presents five practical layers as an ordered sequence that forms the basis of a DigIdeRP project roadmap; in practice, however, an iterative process ensures that each layer effectively supports and enforces the requirements of the adjacent ones. Each layer is composed of a set of blocks, which determine a roadmap that a security team can follow to successfully implement PaaSS. Several blocks' descriptions are based on the OMG SoaML modeling language and BPMN process descriptions. We identified, designed and implemented seven services that form the PaaSS and described their consumption. The PaaSS Java (J2EE) project, WSDL, and XSD code are provided and explained.

2011

BONAZZI Ricardo - Designing a compliance support system
BARRETO SANZ Miguel Arturo - Bio-inspired computational techniques applied to the clustering and visualization of spatio-temporal geospatial data

The coverage and volume of geo-referenced datasets are extensive and incessantly growing. The systematic capture of geo-referenced information generates large volumes of spatio-temporal data to be analyzed. Clustering and visualization play a key role in exploratory data analysis and in the extraction of the knowledge embedded in these data. However, the special characteristics of such data pose new challenges for visualization and clustering: complex structures, large numbers of samples, variables involved in a temporal context, high dimensionality, and large variability in cluster shapes.

The central aim of my thesis is to propose new algorithms and methodologies for clustering and visualization, in order to assist knowledge extraction from spatio-temporal geo-referenced data and thus improve decision-making processes. I present two original algorithms, one for clustering, the Fuzzy Growing Hierarchical Self-Organizing Networks (FGHSON), and one for exploratory visual data analysis, the Tree-structured Self-Organizing Maps Component Planes. In addition, I present methodologies that, combined with the FGHSON and the Tree-structured SOM Component Planes, allow space and time to be integrated seamlessly and simultaneously in order to extract knowledge embedded in a temporal context.

The originality of the FGHSON lies in its capability to reflect the underlying structure of a dataset in a hierarchical, fuzzy way. A hierarchical fuzzy representation of clusters is crucial when data include complex structures with large variability of cluster shapes, variances, densities and numbers of clusters. The most important characteristics of the FGHSON are: (1) it does not require an a priori setup of the number of clusters; (2) the algorithm executes several self-organizing processes in parallel, so when dealing with large datasets the processes can be distributed, reducing the computational cost; and (3) only three parameters are necessary to set up the algorithm. In the case of the Tree-structured SOM Component Planes, the novelty lies in the ability to create a structure that allows visual exploratory analysis of large high-dimensional datasets. This algorithm creates a hierarchical structure of Self-Organizing Map Component Planes, arranging the projections of similar variables in the same branches of the tree. Hence, similarities in the variables' behavior can be easily detected (e.g. local correlations, maximal and minimal values, and outliers).

Both the FGHSON and the Tree-structured SOM Component Planes were applied to several agroecological problems, proving very efficient for the exploratory analysis and clustering of spatio-temporal datasets. In this thesis I also tested three soft competitive learning algorithms: two well-known unsupervised soft competitive algorithms, namely the Self-Organizing Maps (SOMs) and the Growing Hierarchical Self-Organizing Maps (GHSOMs), and our original contribution, the FGHSON. Although these algorithms have been used in several areas, to my knowledge no previous work has applied and compared their performance on spatio-temporal geospatial data as presented in this thesis. I propose original methodologies to explore spatio-temporal geo-referenced datasets through time; our approach uses time windows to capture temporal similarities and variations by means of the FGHSON clustering algorithm.

The developed methodologies are used in two case studies: in the first, the objective was to find similar agroecozones through time; in the second, it was to find similar environmental patterns shifted in time. Several results presented in this thesis have led to new contributions to agroecological knowledge, for instance in sugar cane and blackberry production. Finally, in the framework of this thesis we developed several software tools: (1) a Matlab toolbox that implements the FGHSON algorithm, and (2) a program called BIS (Bio-inspired Identification of Similar agroecozones), an interactive graphical user interface tool which integrates the FGHSON algorithm with Google Earth in order to show zones with similar agroecological characteristics.
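
For background, the following minimal sketch shows the standard self-organizing map update that GHSOM- and FGHSON-style algorithms build upon (this is the textbook SOM, not the FGHSON itself; the grid size, learning rates and decay constants are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
grid_w, grid_h, dim = 5, 5, 3             # 5x5 map of 3-D prototype vectors
weights = rng.random((grid_w * grid_h, dim))
coords = np.array([(i, j) for i in range(grid_w) for j in range(grid_h)])
data = rng.random((200, dim))             # toy data set

for t in range(1000):
    x = data[rng.integers(len(data))]
    bmu = np.argmin(np.linalg.norm(weights - x, axis=1))  # best-matching unit
    sigma = 2.0 * np.exp(-t / 500)        # neighborhood radius shrinks over time
    alpha = 0.5 * np.exp(-t / 500)        # learning rate shrinks over time
    d2 = np.sum((coords - coords[bmu]) ** 2, axis=1)
    h = np.exp(-d2 / (2 * sigma**2))      # Gaussian neighborhood kernel
    weights += alpha * h[:, None] * (x - weights)

# Each data point is summarized by its nearest prototype (its cluster).
labels = np.argmin(
    np.linalg.norm(data[:, None, :] - weights[None, :, :], axis=2), axis=1)
print(np.bincount(labels, minlength=len(weights)))
```

Growing hierarchical variants such as the GHSOM, and the fuzzy FGHSON proposed in the thesis, replace this fixed grid with maps that grow and spawn child maps where the data are not yet well represented.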

2010

ALLANI Mouna - Tree-based message diffusion for managing replicated data in unreliable and resource-constrained peer-to-peer environments

This thesis proposes a set of adaptive broadcast solutions and an adaptive data replication solution to support the deployment of P2P applications. P2P applications are an emerging type of distributed applications that run on top of P2P networks. Typical P2P applications include video streaming and file sharing.
While interesting because they are fully distributed, P2P applications suffer from several deployment problems due to the nature of the environment in which they run. Indeed, defining an application on top of a P2P network often means defining an application in which peers contribute resources in exchange for their ability to use the P2P application. For example, in a P2P file-sharing application, while a user is downloading some file, the P2P application is in parallel serving that file to other users. Such peers could have limited hardware resources, e.g., CPU, bandwidth and memory, or the end user could decide a priori to limit the resources dedicated to the P2P application. In addition, a P2P network is typically immersed in an unreliable environment, where communication links and processes are subject to message losses and crashes, respectively.
To support P2P applications, this thesis proposes a set of services that address some underlying constraints related to the nature of P2P networks. The proposed services include a set of adaptive broadcast solutions and an adaptive data replication solution that can be used as the basis of several P2P applications. Our data replication solution makes it possible to increase availability and to reduce communication overhead. The broadcast solutions aim at providing a communication substrate encapsulating one of the key communication paradigms used by P2P applications: broadcast. Our broadcast solutions typically aim at offering reliability and scalability to some upper layer, be it an end-to-end P2P application or another system-level layer, such as a data replication layer.
Our contributions are organized in a protocol stack made of three layers. In each layer, we propose a set of adaptive protocols that address specific constraints imposed by the environment. Each protocol is evaluated through a set of simulations. The adaptiveness aspect of our solutions relies on the fact that they take into account the constraints of the underlying system in a proactive manner.
To model these constraints, we define an environment approximation algorithm allowing us to obtain an approximate view of the system or part of it. This approximate view includes the topology and the reliability of components, expressed in probabilistic terms.
To adapt to the underlying system constraints, the proposed broadcast solutions route messages through tree overlays so as to maximize broadcast reliability. Here, broadcast reliability is expressed as a function of the reliability of the selected paths and of the use of available resources. These resources are modeled in terms of message quotas reflecting the receiving and sending capacities of each node. To allow deployment in a large-scale system, we take into account the available memory at processes by limiting the view they must maintain of the system. Using this partial view, we propose three scalable broadcast algorithms, based on a propagation overlay that converges to the global tree overlay and adapts to constraints of the underlying system.
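
As a toy illustration of scoring a broadcast tree by path reliability (an illustrative model with hypothetical values, not the thesis's actual protocol), the probability that a node receives a message can be taken as the product of the link reliabilities along its path from the root:

```python
# node -> (parent, link reliability); "A" is the root.
tree = {
    "B": ("A", 0.95), "C": ("A", 0.80),
    "D": ("B", 0.90), "E": ("B", 0.99), "F": ("C", 0.70),
}

def delivery_probability(node):
    # Multiply link reliabilities on the path up to the root.
    p = 1.0
    while node in tree:
        parent, rel = tree[node]
        p *= rel
        node = parent
    return p

# Expected number of receivers (the root always holds the message).
expected = 1 + sum(delivery_probability(v) for v in tree)
print(f"expected coverage: {expected:.2f} of {1 + len(tree)} nodes")
```

Comparing such scores across candidate overlays, subject to per-node message quotas, is one way to select a tree that maximizes broadcast reliability.
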
At a higher level, this thesis also proposes a data replication solution that is adaptive both in terms of replica placement and in terms of request routing. At the routing level, this solution takes the unreliability of the environment into account in order to maximize reliable delivery of requests. At the replica placement level, the dynamically changing origin and frequency of read/write requests are analyzed in order to define a set of replicas that minimizes communication cost.

BACH Christian Woldemar - Interactive epistemology and reasoning: on the foundations of game theory

Game theory describes and analyzes strategic interaction. A distinction is usually made between static games, which are strategic situations in which the players choose only once and simultaneously, and dynamic games, which are strategic situations involving sequential choices. In addition, dynamic games can be further classified according to perfect and imperfect information. A dynamic game is said to exhibit perfect information whenever, at any point of the game, every player has full informational access to all choices that have been made so far; in the case of imperfect information, some players are not fully informed about some choices. Game-theoretic analysis proceeds in two steps. Firstly, games are modelled by so-called form structures, which extract and formalize the significant parts of the underlying strategic interaction. The basic and most commonly used models of games are the normal form, which rather sparsely describes a game merely in terms of the players' strategy sets and utilities, and the extensive form, which models a game in a more detailed way as a tree. It is standard to formalize static games with the normal form and dynamic games with the extensive form. Secondly, solution concepts are developed to solve models of games, in the sense of identifying the choices that should be taken by rational players. The ultimate objective of the classical approach to game theory, which is of normative character, is the development of a solution concept capable of identifying a unique choice for every player in an arbitrary game. However, given the large variety of games, it is not at all certain that it is possible to devise a solution concept with such universal capability. Alternatively, interactive epistemology provides an epistemic approach to game theory of descriptive character. This rather recent discipline analyzes the relation between knowledge, belief and choice of game-playing agents in an epistemic framework. The description of the players' choices in a given game relative to various epistemic assumptions constitutes the fundamental problem addressed by an epistemic approach to game theory. In a general sense, the objective of interactive epistemology consists in characterizing existing game-theoretic solution concepts in terms of epistemic assumptions, as well as in proposing novel solution concepts by studying the game-theoretic implications of refined or new epistemic hypotheses. Intuitively, an epistemic model of a game can be interpreted as representing the reasoning of the players: before making a decision in a game, the players reason about the game and their respective opponents, given their knowledge and beliefs. Precisely these epistemic mental states, on which players base their decisions, are explicitly expressible in an epistemic framework.

In this PhD thesis, we consider an epistemic approach to game theory from a foundational point of view. In Chapter 1, basic game-theoretic notions as well as Aumann's epistemic framework for games are expounded and illustrated; Aumann's sufficient conditions for backward induction are also presented and his conceptual views discussed. In Chapter 2, Aumann's interactive epistemology is conceptually analyzed. In Chapter 3, which is based on joint work with Conrad Heilmann, a three-stage account for dynamic games is introduced and a type-based epistemic model is extended with a notion of agent connectedness; sufficient conditions for backward induction are then derived. In Chapter 4, which is based on joint work with Jérémie Cabessa, a topological approach to interactive epistemology is initiated. In particular, the epistemic-topological operator limit knowledge is defined and some implications for games are considered. In Chapter 5, which is based on joint work with Jérémie Cabessa and Andrés Perea, Aumann's impossibility theorem on agreeing to disagree is revisited and weakened, in the sense that possible contexts are provided in which agents can indeed agree to disagree.
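
For reference, the normal form mentioned above admits the standard textbook formalization (not specific to this thesis):

```latex
% A normal-form game: a set of players, a strategy set per player,
% and a utility function per player over strategy profiles.
\[
G = \bigl(N,\ (S_i)_{i \in N},\ (u_i)_{i \in N}\bigr),
\qquad
u_i : \prod_{j \in N} S_j \to \mathbb{R}.
\]
```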

Thesis in joint-supervision with the University of Maastricht

REMILI Ahmed - Contribution à l’élaboration de mécanismes de détection, de contrôle et de lutte contre le blanchiment d’argent au regard de l’usage des technologies de l’information et de la communication « cas de l’Algérie »

The fight against money laundering has become a priority for states and governments, with the aim, on the one hand, of preserving the economy and the integrity of financial centres and, on the other hand, of depriving criminal organizations of financial resources. In this context, the major concern of the Algerian authorities in charge of fighting this phenomenon is to set up a system capable of detecting laundering mechanisms, assessing the threat they pose and, on the basis of this knowledge, defining and deploying the most effective and efficient means of response. We observe, however, that conducting laundering investigations only as a follow-up to an underlying crime has shown its limits in terms of establishing evidence, solving cases and recovering assets. We therefore believe it would be wiser to set up, upstream, a "systematic" control of financial flows and of unusual and/or suspicious operations and, from there, to identify possible laundering operations without necessarily knowing the initial crime, while maintaining the balance between an all-security approach oriented towards increased surveillance of flows and the preservation of privacy and individual liberties.

Our thesis takes a critical look at the current anti-money-laundering system in Algeria, which we evaluate and in which we identify several shortcomings. To address the problems identified, we propose strategic, organizational, methodological and technological solutions, integrated into a coherent operational framework at the national and international levels.

TASHI Igli - An assurance-based model to holistically assess the information security posture

Evaluating the information security posture of an organization is becoming a very complex task. Currently, the evaluation and assessment of information security are commonly performed using frameworks, methodologies and standards which often consider the various aspects of security independently. Unfortunately, this is ineffective because it ignores the need for a global, systemic and multidimensional approach to information security evaluation, while the overall security level is generally considered to be only as strong as its weakest link.

This thesis proposes a model aiming to holistically assess all dimensions of security in order to minimize the likelihood that a given threat will exploit the weakest link. A formalized structure taking into account all security elements is presented; this is based on a methodological evaluation framework in which Information Security is evaluated from a global perspective.

BROUSSE Olivier - A bio-inspired agent-based programming environment for pervasive platforms

Ubiquitous Computing is the emerging trend in computing systems. Based on this observation, this thesis proposes an analysis of the hardware and environmental constraints that govern pervasive platforms. These constraints have a strong impact on the programming of such platforms; solutions are therefore proposed to facilitate this programming at both the platform and node levels.

The first contribution presented in this document proposes a combination of agent-oriented programming with the principles of bio-inspiration (phylogenesis, ontogenesis and epigenesis) to program pervasive platforms such as the PERvasive computing framework for modeling comPLEX virtually Unbounded Systems (PERPLEXUS) platform.

The second contribution proposes a method for efficiently programming parallelizable applications on each computing node of this platform.

Thesis in joint-supervision with the Université de Montpellier

DARABOS Christian - Towards robust network based complex systems. From evolutionary cellular automata to biological models

Complex systems science is an interdisciplinary field grouping under the same umbrella dynamical phenomena from the social, natural and mathematical sciences. The emergence of a higher-order organization or behavior, transcending that expected from the linear addition of the parts, is a key factor shared by all these systems. Most complex systems can be modeled as networks that represent the interactions among the system's components. In addition to the actual nature of these interactions, the intrinsic topological structure of the underlying network is believed to play a crucial role in the remarkable emergent behaviors exhibited by such systems. Moreover, topology is also a key factor in explaining their extraordinary flexibility and resilience to perturbations in transmission and diffusion phenomena. In this work, we study the effect of different network structures on the performance and fault tolerance of systems in two different contexts.

Thesis in joint-supervision with the University of Turin

GASPOZ Cédric - Prediction markets supporting technology assessment

In this thesis, we study the use of prediction markets for technology assessment. We particularly focus on their ability to assess complex issues, the design constraints required for such applications and their efficacy compared to traditional techniques. To achieve this, we followed a design science research paradigm, iteratively developing, instantiating, evaluating and refining the design of our artifacts. This allowed us to make multiple contributions, both practical and theoretical.
We first showed that prediction markets are adequate for properly assessing complex issues. We also developed a typology of design factors and design propositions for using these markets in a technology assessment context. Then, we showed that they are able to solve some issues related to the R&D portfolio management process and we proposed a roadmap for their implementation. Finally, by comparing the instantiation and the results of a multi-criteria decision method and a prediction market, we showed that the latter is more efficient while offering similar results. We also proposed a framework for comparing forecasting methods in order to identify their constraints based on contingency factors. In conclusion, our research opens a new field of application of prediction markets and should help hasten their adoption by enterprises.
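
The market mechanism itself is not spelled out in this abstract; as a hedged illustration of how a prediction market aggregates dispersed judgments into a price, the following sketch implements Hanson's logarithmic market scoring rule (LMSR), a common automated market-maker design. It is not to be read as the design instantiated in the thesis, and all names and parameter values are invented placeholders.

```python
import math

class LMSRMarket:
    """Logarithmic market scoring rule over mutually exclusive outcomes.
    The liquidity parameter b controls price sensitivity (placeholder value)."""
    def __init__(self, outcomes, b=100.0):
        self.b = b
        self.shares = {o: 0.0 for o in outcomes}

    def cost(self):
        return self.b * math.log(sum(math.exp(q / self.b) for q in self.shares.values()))

    def price(self, outcome):
        """Instantaneous price = the market's current probability estimate."""
        z = sum(math.exp(q / self.b) for q in self.shares.values())
        return math.exp(self.shares[outcome] / self.b) / z

    def buy(self, outcome, amount):
        """Return the cost of buying `amount` shares of `outcome`."""
        before = self.cost()
        self.shares[outcome] += amount
        return self.cost() - before

market = LMSRMarket(["tech A adopted", "tech B adopted"])
print(market.buy("tech A adopted", 30))   # the trader pays this cost
print(market.price("tech A adopted"))     # implied probability rises above 0.5
```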

MELIANE Rym - Understanding consumers' repurchase in the context of online shopping: an empirical study

The use of the Internet as a shopping and purchasing medium has seen exceptional growth. However, 99% of new online businesses fail. Most online buyers do not come back for a repurchase, and 60% abandon their shopping cart before checkout. After the first purchase, online consumer retention thus becomes critical to the success of the e-commerce vendor. Retaining existing customers can save costs, increase profits, and is a means of gaining competitive advantage.

Past research identified loyalty as the most important factor in achieving customer retention, and commitment as one of the most important factors in relationship marketing, providing a good description of the type of thinking that leads to loyalty. Yet, we could not find an e-commerce study investigating the impact of both online loyalty and online commitment on online repurchase. One of the advantages of online shopping is the ability to browse for the best price in one click. Yet, we could not find empirical e-commerce research investigating the impact of post-purchase price perception on online repurchase. The objective of this research is to develop a theoretical model aimed at understanding online repurchase, or purchase continuance from the same online store.

Our model was tested in a real e-commerce context with an overall sample of 1,866 real online buyers from the same online store. The study focuses on repurchase; therefore, randomly selected respondents had purchased from the online store at least once prior to the survey. Five months later, we tracked respondents to see if they had actually come back for a repurchase.

Our findings show that online Intention to repurchase has a non-significant impact on online Repurchase. Online post-purchase Price perception and online Normative Commitment have a non-significant impact on online Intention to repurchase, whereas online Affective Commitment, online Attitudinal Loyalty, online Behavioral Loyalty, and online Calculative Commitment have a positive impact on online Intention to repurchase. Furthermore, online Attitudinal Loyalty partially mediates between online Affective Commitment and online Intention to repurchase, and online Behavioral Loyalty partially mediates between online Attitudinal Loyalty and online Intention to repurchase.

We conducted two follow-up analyses: 1) On a sample of first-time buyers, we found that online post-purchase Price perception has a positive impact on Intention. 2) We divided the main study's sample into Swiss-French and Swiss-German repeat buyers; results show that Swiss-French buyers display more emotion when shopping online than Swiss-Germans. Our findings contribute to academic research as well as to practice.

SATIZABAL MEJIA Hector Fabio - Using biological inspiration to perform incremental modelling tasks
PESTELACCI Enea - Emergence of cooperation on static and dynamic networks

Game theory is a branch of applied mathematics used to analyze situations where two or more agents interact. Originally developed as a model for conflicts and collaborations between rational and intelligent individuals, it now finds applications in the social sciences, economics, biology (particularly evolutionary biology and ecology), engineering, political science, international relations, computer science, and philosophy. Networks are an abstract representation of interactions, dependencies or relationships. Networks are extensively used in all the fields mentioned above and in many more, and much useful information about a system can be discovered by analyzing the current state of a network representation of that system. In this work we apply methods of game theory to populations of interconnected agents: a population is represented by a network of players in which one player can only interact with another if there is a connection between them. In the first part of this work we show that the structure of the underlying network has a strong influence on the strategies that players adopt to maximize their utility. We then introduce a supplementary degree of freedom by allowing the structure of the population to be modified during the simulations, which lets players reshape their environment to optimize the utility they can obtain.
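
To make the imitation dynamics concrete, here is a minimal sketch (illustrative only, not the simulation code used in this work): agents on a fixed network play the prisoner's dilemma against their neighbours, then copy the strategy of their best-scoring neighbour. The payoff values and the ring topology are placeholders.

```python
import random

# Prisoner's dilemma payoffs: PAYOFF[(mine, opponent)], 'C' = cooperate,
# 'D' = defect; the usual ordering T > R > P > S (placeholder values).
PAYOFF = {('C', 'C'): 3, ('C', 'D'): 0, ('D', 'C'): 5, ('D', 'D'): 1}

def play_round(neighbours, strategy):
    """One synchronous round: accumulate payoffs against all neighbours,
    then imitate the best-scoring neighbour (including oneself)."""
    score = {i: sum(PAYOFF[(strategy[i], strategy[j])] for j in neighbours[i])
             for i in neighbours}
    return {i: strategy[max(list(neighbours[i]) + [i], key=lambda k: score[k])]
            for i in neighbours}

# A small ring network; the thesis studies richer complex-network topologies.
n = 20
neighbours = {i: [(i - 1) % n, (i + 1) % n] for i in range(n)}
strategy = {i: random.choice('CD') for i in range(n)}
for _ in range(50):
    strategy = play_round(neighbours, strategy)
print(''.join(strategy[i] for i in range(n)))
```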

FACCHINI Alessandro - A study on the expressive power of some fragments of the modal mu-calculus

The μ-calculus is an extension of modal logic with fixed-point operators. In this work we study the complexity of certain fragments of this logic from two different but closely related points of view: one syntactic (or combinatorial), the other topological. From the syntactic point of view, the properties definable in this formalism are classified according to the combinatorial complexity of the formulas of the logic, that is, according to the number of alternations of fixed-point operators. Comparing two sets of models thus amounts to comparing the syntactic complexity of the associated formulas. From the topological point of view, the properties definable in this logic are compared by means of continuous reductions or according to their positions in the Borel or projective hierarchies.

Thesis in joint-supervision with the Université Bordeaux-I, France

HENDAOUI Adel - Consumer shopping behavior in 3D virtual worlds: a theoretical and empirical investigation

2009

MINIAOUI Sami - Indexation et annotation pour améliorer le partage des documents

The sharing and reuse of learning objects is still a utopia. The pooling of educational documents and their adaptation to different contexts have been the subject of a great deal of work. One problematic aspect concerns their description, which must be as precise as possible in order to facilitate their management and, more specifically, targeted access. This description is generally carried out by instantiating a set of standardized descriptors or metadata (LOM, ARIADNE, DC, etc.). Despite the existence of these standards, some of which are relatively unconstraining, few educators or authors engage in this exercise, which remains cumbersome and unrewarding. We started from the idea that if indexing could be performed automatically with a good degree of accuracy, part of the solution would be found. To this end, we first analyzed the factors blocking the manual generation performed by the educational engineers of the Université de Lausanne. The complexity of these factors (human and technical) reinforced our conviction that the automatic generation of metadata was indeed a way around the difficulties identified. We therefore developed an application for the automatic generation of metadata which focuses on content as its sole extraction source. An in-depth analysis of the results showed that:
- for unstructured documents, our application yields satisfactory results with respect to the quality indicators for metadata (completeness, precision, logical consistency and coherence);
- for structured documents, automatic generation proved unsatisfactory insofar as it does not exploit the semantic elements (structure, annotations) they contain.
In the latter case we believed it was possible to do better, and we therefore pursued our work in order to propose a second application that takes advantage of the potential of structured documents and the related transformation languages (XSLT) to improve search within these documents. This second application exploits all the semantic elements (structure, annotations) and constitutes an alternative to metadata-based search. Moreover, search based on annotations and structure offers the additional advantage of retrieving not only the documents themselves but also parts of documents, a considerable improvement over metadata-based search, which only gives access to entire documents. In conclusion, we show through appropriate examples that, depending on the type of document, it is possible either to index unstructured documents automatically to facilitate their retrieval, or to directly exploit the semantic content of structured documents.
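
As a minimal sketch of content-based metadata generation of the kind described above (purely illustrative; the actual application targets standardized descriptors such as LOM and is considerably more elaborate), a few Dublin-Core-like fields can be derived from the raw text alone:

```python
import re
from collections import Counter

STOPWORDS = {"the", "a", "of", "and", "to", "in", "is", "for", "such", "as"}

def generate_metadata(text, language="en"):
    """Derive a few Dublin-Core-like descriptors from raw content only."""
    lines = [l.strip() for l in text.splitlines() if l.strip()]
    words = re.findall(r"[a-zA-Z]{3,}", text.lower())
    keywords = [w for w, _ in Counter(w for w in words
                                      if w not in STOPWORDS).most_common(5)]
    return {
        "dc:title": lines[0] if lines else "",      # naive: first non-empty line
        "dc:language": language,
        "dc:subject": keywords,                      # most frequent content words
        "dc:description": " ".join(lines[1:3]),
    }

doc = """Introduction to Metadata
Metadata describe documents and ease their retrieval.
Standards like LOM and Dublin Core define descriptor sets."""
print(generate_metadata(doc))
```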

FANG Gang - Firm's network capability and innovation performance: evidences from China hi-tech industry

A firm's competitive advantage can arise from internal resources as well as from an interfirm network. This dissertation investigates the competitive advantage of a firm involved in an innovation network by integrating strategic management theory and social network theory. It develops theory and provides empirical evidence illustrating how a networked firm enables network value and appropriates this value in an optimal way according to its strategic purpose. The four inter-related essays in this dissertation provide a framework that sheds light on the extraction of value from an innovation network by managing and designing the network in a proactive manner.

SONNA MOMO Lambert - Elaboration de tableaux de bord SSI dynamiques: une approche à base d'ontologies
HOLZER Adrian - Modular support for developing mobile ad hoc applications

This PhD thesis addresses the issue of alleviating the burden of developing ad hoc applications. Such applications have the particularity of running on mobile devices, communicating in a peer-to-peer manner and implementing proximity-based semantics. A typical example is a radar application where users see their avatar, as well as the avatars of their friends, on a map on their mobile phone. Such applications have become increasingly popular with the advent of the latest generation of mobile smart phones, with their impressive computational power, peer-to-peer communication capabilities and location detection technology. Unfortunately, existing programming support for such applications is limited, hence the need to address this issue in order to alleviate their development burden.

This thesis specifically tackles this problem by providing several tools for application development support. First, it provides the location-based publish/subscribe service (LPSS), a communication abstraction which elegantly captures recurrent communication issues and thus dramatically reduces code complexity. LPSS is implemented in a modular manner in order to target two different network architectures. One pragmatic implementation is aimed at mainstream infrastructure-based mobile networks, where mobile devices communicate through fixed antennas. The other, fully decentralized, implementation targets emerging mobile ad hoc networks (MANETs), where no fixed infrastructure is available and communication can only occur in a peer-to-peer fashion. For each of these architectures, various implementation strategies tailored to different application scenarios are provided, which can be parametrized at deployment time. Second, this thesis provides two location-based message diffusion protocols, namely 6Shot broadcast and 6Shot multicast, specifically aimed at MANETs and fine-tuned to be used as building blocks for LPSS. Finally, this thesis proposes Phomo, a phone motion testing tool that allows the proximity semantics of ad hoc applications to be tested without having to move around with mobile devices. These development support tools have been packaged in a coherent middleware framework called Pervaho.
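
As a rough sketch of the proximity semantics such an abstraction captures (the class names and API below are illustrative, not the actual LPSS or Pervaho interfaces), a subscription can be modelled as a topic plus a radius around the subscriber's position, with delivery conditioned on publisher-subscriber distance:

```python
import math
from dataclasses import dataclass

@dataclass
class Subscription:
    topic: str
    x: float          # subscriber position (updated as the device moves)
    y: float
    range_m: float    # proximity radius in metres
    callback: object  # called for each matching publication

class LocationPubSub:
    """Toy location-based publish/subscribe: a publication on a topic is
    delivered only to subscribers within their declared range of the
    publisher. Illustrative only; LPSS additionally handles mobility and
    infrastructure-based vs. MANET dissemination."""
    def __init__(self):
        self.subs = []

    def subscribe(self, sub):
        self.subs.append(sub)

    def publish(self, topic, x, y, message):
        for s in self.subs:
            if s.topic == topic and math.hypot(s.x - x, s.y - y) <= s.range_m:
                s.callback(message)

bus = LocationPubSub()
bus.subscribe(Subscription("radar/friends", 0.0, 0.0, 500.0,
                           lambda m: print("friend nearby:", m)))
bus.publish("radar/friends", 120.0, 90.0, "alice@(120,90)")   # delivered
bus.publish("radar/friends", 4000.0, 0.0, "bob@(4000,0)")     # out of range
```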

ROCHAT Denis - Réseaux complexes non homogènes appliqués aux réseaux mobiles ad'hoc
MOTTIER-LESZNER Ewa - Evaluation multidimensionnelle de l'impact des systèmes d'information sur l'organisation des PME

2008

MARIETHOZ JUSSUPOVA Yelena - Design of the corporate memory based on the intellectual capital monitoring
MADHOUR Hend - Modèle et application pour la génération automatique de parcours d'apprentissage personnalisés à partir de sources de confiance

When searching for information, learners are very often confronted with problems of guidance and personalization. These problems are all the more important when the search takes place in an open environment such as the Web. Indeed, in this case, there is currently no relevance control over the proposed resources, nor over their actual adequacy to the learner's specific needs.
Through a study of the state of the art, we observed the absence of a reference model addressing the problems related (i) on the one hand to learning resources, notably the heterogeneity of their structure and description and their protection in terms of copyright, and (ii) on the other hand to the learner as a user, notably the acquisition of the elements characterizing him or her and the adaptation strategy to offer.
Our objective is to propose an adaptive system based on learning resources from an environment with controlled openness. It automatically generates, without the intervention of a pedagogical expert, a personalized learning path from resources made available through trusted sources.
The originality of our work lies in the proposal of a reference model, called the Lausanne model, which is based on what we consider to be the best practices of three communities: (i) the Web in terms of means of openness, (ii) adaptive hypermedia in terms of adaptation strategy, and (iii) distance learning in terms of the handling of learning resources.
In our model, personalized learning paths are generated on the basis of: (i) indexed learning resources whose degree of granularity favours sharing and reuse, the trusted sources used guaranteeing their usefulness and quality;
(ii) user characteristics, compatible with existing standards, allowing the learner to move from one environment to another;
(iii) an adaptation that is both individual and social.
To this end, the Lausanne model proposes:
(i) to use ISO/MLR (Metadata for Learning Resources) as the description formalism;
(ii) to describe the user model with XUM (eXtended User Model), our proposal for a model compatible with the IEEE/PAPI and IMS/LIP standards;
(iii) to adapt the ant-colony algorithm to the distance-learning context in order to generate personalized paths (a sketch follows below); the individual dimension is also taken into account through the mapping between MLR and XUM.
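
A hedged sketch of an ant-colony algorithm adapted to learning-path generation (all resource names, suitability scores and parameters below are invented for illustration; the thesis's actual adaptation and fitness model differ): ants walk a graph of learning resources, good paths deposit pheromone on their transitions, and the strongest trail becomes the recommended path.

```python
import random

# Directed graph of learning resources: node -> possible successors.
GRAPH = {"intro": ["basics"], "basics": ["exercise", "theory"],
         "exercise": ["quiz"], "theory": ["quiz"], "quiz": []}

# Illustrative suitability of each resource for one learner (0..1).
SUITABILITY = {"intro": 1.0, "basics": 0.9, "exercise": 0.8,
               "theory": 0.4, "quiz": 1.0}

pheromone = {(u, v): 1.0 for u in GRAPH for v in GRAPH[u]}

def walk(start="intro"):
    """One ant builds a path, choosing successors with probability
    proportional to pheromone * suitability."""
    path, node = [start], start
    while GRAPH[node]:
        succ = GRAPH[node]
        weights = [pheromone[(node, v)] * SUITABILITY[v] for v in succ]
        node = random.choices(succ, weights=weights)[0]
        path.append(node)
    return path

for _ in range(200):                       # colony iterations
    path = walk()
    quality = sum(SUITABILITY[n] for n in path) / len(path)
    for edge in zip(path, path[1:]):       # reinforce the trail taken
        pheromone[edge] += quality
    for edge in pheromone:                 # evaporation
        pheromone[edge] *= 0.99

print(walk())  # after convergence, most walks follow the strongest trail
```
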
To validate our model, we developed an application and tested several scenarios involving different users at different times. We then compared what the system returns with what the expert suggests. The results proved satisfactory insofar as the system returned, in each case, a path similar to the one the expert would have proposed, which supports our approach.

2007

SFAXI Mohamed Ali - Improving telecommunication security level by integrating quantum key distribution in communication protocols

Classical cryptography is based on mathematical functions. The robustness of a cryptosystem essentially depends on the difficulty of computing the inverse of its one-way function, and there is no mathematical proof establishing whether it is impossible to find the inverse of a given one-way function. It is therefore mandatory to use a cryptosystem whose security is scientifically proven (especially for banking, government, etc.). The security of quantum cryptography, on the other hand, can be formally demonstrated: it rests on the laws of physics, which ensure unconditional security. How, then, can quantum cryptography be used and integrated into existing solutions?

This thesis proposes a method to integrate quantum cryptography into existing communication protocols such as PPP, IPSec and 802.11i. It sketches out possible scenarios in order to prove their feasibility and estimate their cost. Directives and checkpoints are given to help certify quantum cryptography solutions according to the Common Criteria.
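
As a hedged sketch of this integration pattern: the quantum channel only supplies fresh shared key material, which the classical protocol (e.g., the encryption layer of IPSec) then consumes for symmetric encryption. In the sketch below, fetch_qkd_key is a stub standing in for a real QKD link, and AES-GCM from the third-party cryptography package plays the role of the classical cipher.

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def fetch_qkd_key(num_bytes=32):
    """Stub for the quantum key distribution link: in a real deployment this
    would return key material negotiated over the quantum channel (e.g. BB84).
    Here we just draw random bytes so the sketch is runnable."""
    return os.urandom(num_bytes)

def send_secured(payload: bytes):
    """Encrypt one message with a fresh QKD-derived session key, mimicking how
    a classical protocol (PPP, IPSec, 802.11i) would consume QKD key material."""
    key = fetch_qkd_key()
    nonce = os.urandom(12)
    ciphertext = AESGCM(key).encrypt(nonce, payload, None)
    return key, nonce, ciphertext   # the key is shared via the QKD link itself

key, nonce, ct = send_secured(b"renegotiate frequently with fresh QKD keys")
print(AESGCM(key).decrypt(nonce, ct, None))
```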

FOVEAU Charles-Emmanuel - Référentiels des compétences et des métiers: une approche ontologique

The globalization of markets, changes in the economic context and the impact of new information technologies have forced companies to rethink the way they manage their intellectual capital (knowledge management) and human capital (competency management). It is commonly accepted today that these play a particularly strategic role in the organization. A company wishing to embark on a policy of managing these assets will have to face various problems. Indeed, in order to manage this knowledge and these competencies, a long capitalization process must be carried out, going through steps such as the identification, extraction and representation of knowledge and competencies.
To this end, various knowledge and competency management methods exist, such as MASK, CommonKADS and KOD. Unfortunately, these methods are very cumbersome to implement, are confined to certain types of knowledge, and are consequently limited in the functionalities they can offer. Finally, competency management and knowledge management are two dissociated domains, whereas it would be interesting to unify these two approaches into a single one. Indeed, competencies are very close to knowledge, as underlined by the following definition of competency: "a set of knowledge in action in a given context".
Consequently, we chose to base our proposal on the concept of competency. Among the knowledge of the company, competency is one of the most crucial, in particular for avoiding the loss of know-how or for anticipating the future needs of the company, because behind the competencies of employees lies the effectiveness of the organization. Moreover, many other organizational concepts can be described through competencies, such as jobs, missions, projects and training. Unfortunately, there is no real consensus on the definition of competency; the various existing definitions, even if they are fully satisfactory for experts, do not make it possible to build an operational system.
In our approach, we address competency management using a knowledge management method. Indeed, by their very nature, knowledge and competency are intimately linked, and such a method is therefore well suited to competency management.
In order to exploit this knowledge and these competencies, we first had to define the organizational concepts in a clear and computational way. On this basis, we propose a methodology for building the various enterprise repositories (competency, mission, job repositories, etc.). To model these repositories, we chose ontologies, because they make it possible to obtain coherent and consensual definitions of concepts while supporting linguistic diversity.
We then map the company's knowledge (training, missions, jobs, etc.) onto these ontologies in order to exploit and disseminate it.
Our approach to knowledge and competency management led to the realization of a tool offering numerous functionalities such as the management of mobility areas, strategic analysis, directories and CV management.

ONDRUS Jan - A design science approach to support the assessment of disruptive technology: the Swiss mobile payment case

In this thesis we present the design of a systematic, integrated, computer-based approach for detecting potential disruptions from an industry perspective. Following the design science paradigm, we iteratively develop several multi-actor, multi-criteria artifacts dedicated to environment scanning. The contributions of this thesis are both theoretical and practical. We demonstrate the successful use of multi-criteria decision-making methods for technology foresight. Furthermore, we illustrate the design of our artifacts using build-and-evaluate loops supported by a field study of the Swiss mobile payment industry. To increase the relevance of this study, we systematically interviewed key Swiss experts for each design iteration. As a result, our research provides a realistic picture of the current situation in the Swiss mobile payment market and reveals previously undiscovered weak signals of future trends. Finally, we suggest a generic design process for environment scanning.
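
As a hedged, minimal illustration of the multi-actor, multi-criteria style of analysis (a plain weighted-sum aggregation; the artifacts in the thesis rely on richer MCDM methods and expert-elicited data, and every number and name below is an invented placeholder):

```python
# Each actor weights the evaluation criteria differently; each technology
# alternative is scored per criterion on a 0-10 scale (invented numbers).
criteria = ["security", "cost", "user_convenience"]

actor_weights = {
    "bank":     {"security": 0.6, "cost": 0.3, "user_convenience": 0.1},
    "merchant": {"security": 0.2, "cost": 0.5, "user_convenience": 0.3},
    "consumer": {"security": 0.3, "cost": 0.2, "user_convenience": 0.5},
}

scores = {
    "SMS payment": {"security": 4, "cost": 8, "user_convenience": 7},
    "NFC payment": {"security": 7, "cost": 4, "user_convenience": 9},
}

def evaluate(alternative):
    """Weighted-sum score of an alternative from each actor's viewpoint;
    diverging rankings across actors hint at adoption conflicts."""
    return {actor: sum(w[c] * scores[alternative][c] for c in criteria)
            for actor, w in actor_weights.items()}

for alt in scores:
    print(alt, evaluate(alt))
```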

FERNANDES Emmanuel - Ingénieur pédagogique et démarche projet : facteurs clés de succès pour l'intégration des technologies dans la pratique enseignante ?

The governments of Western countries have spent considerable sums to facilitate the integration of information and communication technologies into teaching, hoping to find an economical solution to the thorny equation that could be summarized by the famous formula "do more and better with less". Despite these efforts and the very clear improvement in the quality of service of the infrastructures, this objective is far from being achieved. While we think it is illusory to expect technology, on its own, to solve the problems of teaching quality, we nevertheless believe that it can contribute to improving learning conditions and feed the pedagogical reflection that every teacher should conduct before delivering a course.

With this in mind, and convinced that distance learning offers significant advantages provided one thinks about teaching "differently", we took an interest in the development of this type of application, which lies at the frontier between didactics, cognitive science and computer science. In order to propose a realistic and simple solution facilitating the development, updating, integration and sustainability of distance-learning applications, we got involved in concrete projects. Our field experience showed that (i) the quality of flexible and distance learning (FFD) modules remains very disappointing, among other reasons because the added value that the use of technology can bring is, in our opinion, insufficiently exploited, and that (ii) to succeed, any project must, besides providing a useful answer to a real need, be conducted efficiently with the support of a "champion".

With the aim of proposing a project management approach adapted to the needs of flexible and distance learning, we first examined the characteristics of this type of project. We then analyzed existing project methodologies in the hope of using one of them, or a suitable combination of those closest to our needs. We then defined, empirically and through successive iterations, a pragmatic project management approach and contributed to the development of decision-support sheets facilitating its implementation. We describe some of its actors, insisting particularly on the pedagogical engineer, whom we consider one of the key success factors of our approach and whose vocation is to orchestrate it. Finally, we validated our approach a posteriori by reviewing the course of four FFD projects in which we participated and which are representative of the projects encountered in a university environment.

In conclusion, we believe that the implementation of our approach, accompanied by computerized decision-support sheets, constitutes an important asset and should in particular make it easier to measure the real impact of technologies (i) on the evolution of teaching practice, (ii) on the organization and (iii) on the quality of teaching. Our approach can also serve as a springboard for setting up a quality assurance process specific to FFD.
Further research on the real flexibilization of learning and on the benefits of technologies for learners can then be conducted on the basis of metrics that remain to be defined.

RICKEBUSCH Ian - On solving fair exchange and related distributed problems in byzantine environments

The solvability of the problem of fair exchange in a synchronous system subject to Byzantine failures is investigated in this work. The fair exchange problem arises when a group of processes is required to exchange digital items in a fair manner, meaning that either each process obtains the item it was expecting or no process obtains any information on the inputs of the others.
After introducing a novel specification of fair exchange that clearly separates safety and liveness, we give an overview of the difficulty of solving such a problem in the context of a fully-connected topology. On one hand, we show that no solution to fair exchange exists in the absence of an identified process that every process can trust a priori; on the other, a well-known solution to fair exchange relying on a trusted third party is recalled. These two results lead us to complete our system model with a flexible representation of the notion of trust. We then show that fair exchange is solvable if and only if a connectivity condition, named the reachable majority condition, is satisfied. The necessity of the condition is proven by an impossibility result and its sufficiency by presenting a general solution to fair exchange relying on a set of trusted processes.
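
The reachable majority condition is defined precisely in the thesis; the sketch below only illustrates checking a reachability-style condition of this flavour on a directed trust graph, under the assumed reading that every process must be able to reach a majority of the trusted processes. All names and data are hypothetical.

```python
from collections import deque

def reachable(graph, start):
    """Set of nodes reachable from `start` by BFS over directed trust edges."""
    seen, queue = {start}, deque([start])
    while queue:
        u = queue.popleft()
        for v in graph.get(u, ()):
            if v not in seen:
                seen.add(v)
                queue.append(v)
    return seen

def reaches_majority_of_trusted(graph, processes, trusted):
    """Illustrative check: every process reaches a strict majority of the
    trusted processes (see the thesis for the exact condition)."""
    need = len(trusted) // 2 + 1
    return all(len(reachable(graph, p) & trusted) >= need for p in processes)

graph = {"p1": ["t1"], "p2": ["t1", "t2"], "t1": ["t2", "p1"],
         "t2": ["t3"], "t3": [], "p3": ["t3"]}
trusted = {"t1", "t2", "t3"}
print(reaches_majority_of_trusted(graph, {"p1", "p2", "p3"}, trusted))
# False: p3 only reaches t3, short of a majority of trusted processes.
```
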
The focus is then turned towards a specific network topology in order to provide a fully decentralized, yet realistic, solution to fair exchange. The general solution mentioned above is optimized by reducing the computational load assumed by trusted processes as far as possible. Accordingly, our fair exchange protocol relies on trusted tamperproof modules that have limited communication abilities and are only required in key steps of the algorithm. This modular solution is then implemented in the context of a pedagogical application developed for illustrating and apprehending the complexity of fair exchange. This application, which also includes the implementation of a wide range of Byzantine behaviors, allows executions of the algorithm to be set up and monitored through a graphical display.
Surprisingly, some of our results on fair exchange seem to contradict those found in the literature on secure multiparty computation, a problem from the field of modern cryptography, although the two problems have much in common. Both problems are closely related to the notion of trusted third party, but their approaches and descriptions differ greatly. By introducing a common specification framework, a comparison is proposed in order to clarify their differences and the possible origins of the confusion between them. This leads us to introduce the problem of generalized fair computation, a generalization of fair exchange. Finally, a solution to this new problem is given by generalizing our modular solution to fair exchange.

ROSSI Mathias - Identification et qualification des compétences d'entreprise
CABESSA Jérémie - A game theoretical approach to the algebraic counterpart of the Wagner hierarchy

The Wagner hierarchy is known so far to be the most refined topological classification of ω-rational languages. Also, the algebraic study of formal languages shows that these ω-rational sets correspond precisely to the languages recognizable by finite pointed ω-semigroups. Within this framework, we provide a construction of the algebraic counterpart of the Wagner hierarchy. We adopt a hierarchical game approach, translating the Wadge theory from the ω-rational language context to the ω-semigroup context. More precisely, we first show that the Wagner degree is indeed a syntactic invariant. We then define a reduction relation on finite pointed ω-semigroups by means of a Wadge-like infinite two-player game. The collection of these algebraic structures ordered by this reduction is then proven to be isomorphic to the Wagner hierarchy, namely a well-founded and decidable partial ordering of width 2 and height ω^ω. We also describe a decidability procedure for this hierarchy: we introduce a graph representation of finite pointed ω-semigroups that allows their precise Wagner degrees to be computed. The Wagner degree of every ω-rational language can therefore be computed directly on its syntactic image. We then show how to build a finite pointed ω-semigroup of any given Wagner degree, and finally describe the algebraic invariants characterizing every Wagner degree of this hierarchy.

Thesis in joint-supervision with the Université Paris-Diderot (Paris 7)

2006

CAMPONOVO Giovanni - Conceptual models for designing information systems supporting the strategic analysis of technology environments

Chapter 1 presents the motivations of this dissertation by illustrating two gaps in the current body of knowledge that are worth filling, describes the research problem addressed by this thesis and presents the research methodology used to achieve this goal.
Chapter 2 reviews the existing literature, showing that environment analysis is a vital strategic task, that it should be supported by adapted information systems, and that there is thus a need for a conceptual model of the environment providing a reference framework to better integrate the various existing methods, together with a more formal definition of the various aspects to support the development of suitable tools.
Chapter 3 proposes a conceptual model that specifies the various environmental aspects relevant for strategic decision making, describes how they relate to each other, and defines them in a more formal way better suited to information systems development.
Chapter 4 is dedicated to the evaluation of the proposed model, applying it to a concrete environment in order to assess its suitability for describing the current conditions and potential evolution of a real environment and to get an idea of its usefulness.
Chapter 5 goes a step further by assembling a toolbox of methods that can be used to analyze the various environmental aspects put forward by the model, and by providing more detailed specifications for a number of them to show how our model can facilitate their implementation as software tools.
Chapter 6 describes a prototype of a strategic decision support tool that allows the analysis of aspects of the environment not well supported by existing tools, namely the relationships between multiple actors and issues. The usefulness of this prototype is evaluated through its application to a concrete environment.
Chapter 7 concludes this thesis by summarizing its various contributions and proposing further interesting research directions.
