Previous years

2012 | 2011 | 2010 | 2009 | 2008 | 2007 | 2006
 

2012

MOHAN Kunal - Understanding the acceptance and usage of project management methodologies
DELGADO Carlos - Behavioural queueing: cellular automata and a laboratory experiment

Queueing is a fact of life that we witness daily. We have all had the experience of waiting in line for some reason, and we know it is an annoying situation. As the adage says, "time is money"; this is perhaps the best way of stating what queueing problems mean for customers. Human beings are not very tolerant, and even less so when they have to wait in line for service. Banks, roads, post offices and restaurants are just some examples of places where people must wait for service.

Studies of queueing phenomena have typically addressed the optimisation of performance measures (e.g. average waiting time, queue length and server utilisation rates) and the analysis of equilibrium solutions; the individual behaviour of the agents involved in queueing systems and their decision-making processes have received little attention. Although this line of work has been useful for improving the efficiency of many queueing systems and for designing new processes in social and physical systems, it has provided only a limited ability to explain the behaviour observed in many real queues.

In this dissertation we differ from this traditional research by analysing how the agents involved in the system make decisions instead of focusing on optimising performance measures or analysing an equilibrium solution. This dissertation builds on and extends the framework proposed by van Ackere and Larsen (2004) and van Ackere et al. (2010). We focus on studying behavioural aspects in queueing systems and incorporate this still underdeveloped framework into the operations management field.
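
The framework is summarized here only at a high level. As a rough illustration of the behavioural-queueing idea, the sketch below lets agents choose among facilities using an exponentially smoothed memory of past sojourn times; the load model, parameter names and values are illustrative assumptions, not taken from the thesis.

```python
import random

def simulate(num_agents=100, num_queues=3, periods=200,
             memory_weight=0.5, explore_prob=0.05, seed=1):
    """Minimal behavioural-queueing sketch: agents pick the facility
    with the lowest remembered sojourn time instead of a global optimum."""
    rng = random.Random(seed)
    # Each agent remembers an estimated sojourn time per queue.
    memory = [[1.0] * num_queues for _ in range(num_agents)]
    history = []
    for _ in range(periods):
        # Choice: mostly exploit memory, occasionally explore at random.
        choices = [rng.randrange(num_queues) if rng.random() < explore_prob
                   else min(range(num_queues), key=lambda q: memory[a][q])
                   for a in range(num_agents)]
        loads = [choices.count(q) for q in range(num_queues)]
        # Actual sojourn time grows with the load on the chosen queue.
        for a, q in enumerate(choices):
            sojourn = loads[q] / num_agents * num_queues  # 1.0 = balanced
            memory[a][q] = ((1 - memory_weight) * memory[a][q]
                            + memory_weight * sojourn)
        history.append(loads)
    return history

if __name__ == "__main__":
    for loads in simulate()[-5:]:
        print(loads)  # loads tend to even out as memories converge
```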

BEN AYED Ghazi - Architecting User-Centric Privacy-as-a-Set-of-Services: Digital Identity Related Privacy Framework

Digitalization empowers the Internet by allowing several virtual representations of reality, including that of identity. We leave an increasingly digital footprint in cyberspace, and this situation puts our identity at high risk. Privacy is a right and a fundamental social value that could play a key role as a medium for securing digital identities. Identity functionality is increasingly delivered as sets of services rather than as monolithic applications, so an identity layer in which identity and privacy management services are loosely coupled, publicly hosted and available for on-demand calls is a more realistic and acceptable prospect. Identity and privacy should be interoperable and distributed through the adoption of service-orientation and implementations based on open standards (technical interoperability). The objective of this project is to provide a way to implement interoperable user-centric digital identity-related privacy that responds to the distributed nature of federated identity systems.
It is recognized that technical initiatives, emerging standards and protocols are not enough to guarantee a resolution of the concerns surrounding the multi-faceted and complex issue of identity and privacy. For this reason they should be apprehended within a global perspective, through an integrated and multidisciplinary approach which dictates that privacy laws, policies, regulations and technologies be crafted together from the start, rather than attached to digital identity after the fact. We therefore draw Digital Identity-Related Privacy (DigIdeRP) requirements from global, domestic and business-specific privacy policies; these requirements take the shape of business interoperability. We suggest a layered implementation framework (the DigIdeRP framework), in accordance with the model-driven architecture (MDA) approach, that helps an organization's security team turn business interoperability into technical interoperability in the form of a set of services that fit a service-oriented architecture (SOA): a Privacy-as-a-set-of-services (PaaSS) system. The DigIdeRP framework will serve as a basis for a vital understanding between business management and technical managers on digital identity-related privacy initiatives.
The layered DigIdeRP framework presents five practical layers as an ordered sequence forming the basis of the DigIdeRP project roadmap; in practice, however, an iterative process ensures that each layer effectively supports and enforces the requirements of the adjacent ones. Each layer is composed of a set of blocks, which define a roadmap the security team can follow to implement PaaSS successfully. Several blocks are described using the OMG SoaML modeling language and BPMN process descriptions. We identified, designed and implemented the seven services that form PaaSS and described their consumption. The PaaSS code (a Java EE project), together with the WSDL and XSD definitions, is given and explained.

2011

BONAZZI Ricardo - Designing a compliance support system
BARRETO SANZ Miguel Arturo - Bio-inspired computational techniques applied to the clustering and visualization of spatio-temporal geospatial data

The coverage and volume of geo-referenced datasets are extensive and growing incessantly. The systematic capture of geo-referenced information generates large volumes of spatio-temporal data to be analysed. Clustering and visualization play a key role in exploratory data analysis and in the extraction of the knowledge embedded in these data. However, the special characteristics of such data pose new challenges for visualization and clustering: complex structures, large numbers of samples, variables set in a temporal context, high dimensionality and large variability in cluster shapes. The central aim of my thesis is to propose new algorithms and methodologies for clustering and visualization that assist knowledge extraction from spatio-temporal geo-referenced data, thus improving decision-making processes.
I present two original algorithms, one for clustering, the Fuzzy Growing Hierarchical Self-Organizing Networks (FGHSON), and one for exploratory visual data analysis, the Tree-structured Self-Organizing Maps Component Planes. In addition, I present methodologies that, combined with FGHSON and the Tree-structured SOM Component Planes, integrate space and time seamlessly and simultaneously in order to extract knowledge embedded in a temporal context.
The originality of FGHSON lies in its capability to reflect the underlying structure of a dataset in a hierarchical, fuzzy way. A hierarchical fuzzy representation of clusters is crucial when data include complex structures with large variability in cluster shapes, variances, densities and numbers of clusters. The most important characteristics of FGHSON are: (1) it does not require an a-priori setup of the number of clusters; (2) the algorithm executes several self-organizing processes in parallel, so that when dealing with large datasets the processes can be distributed, reducing the computational cost; and (3) only three parameters are necessary to set up the algorithm.
In the case of the Tree-structured SOM Component Planes, the novelty lies in the ability to create a structure that allows visual exploratory analysis of large high-dimensional datasets. The algorithm creates a hierarchical structure of Self-Organizing Map Component Planes, arranging the projections of similar variables in the same branches of the tree. Similarities in the variables' behavior can thus be easily detected (e.g. local correlations, maximal and minimal values, and outliers).
Both FGHSON and the Tree-structured SOM Component Planes were applied to several agroecological problems, proving to be very efficient in the exploratory analysis and clustering of spatio-temporal datasets. In this thesis I also tested three soft competitive learning algorithms: two well-known unsupervised soft competitive algorithms, namely the Self-Organizing Maps (SOMs) and the Growing Hierarchical Self-Organizing Maps (GHSOMs), and our original contribution, the FGHSON. Although these algorithms have been used in several areas, to my knowledge no previous work has applied and compared their performance on spatio-temporal geospatial data as is done in this thesis. I also propose original methodologies to explore spatio-temporal geo-referenced datasets through time; our approach uses time windows to capture temporal similarities and variations by means of the FGHSON clustering algorithm.
The developed methodologies are used in two case studies: in the first, the objective was to find similar agroecozones through time; in the second, to find similar environmental patterns shifted in time. Several results presented in this thesis have led to new contributions to agroecological knowledge, for instance in sugar cane and blackberry production. Finally, in the framework of this thesis we developed several software tools: (1) a Matlab toolbox that implements the FGHSON algorithm, and (2) a program called BIS (Bio-inspired Identification of Similar agroecozones), an interactive graphical user interface which integrates the FGHSON algorithm with Google Earth in order to show zones with similar agroecological characteristics.
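
FGHSON itself is not specified in this abstract. As background, here is a minimal sketch of the classical self-organizing-map training loop that FGHSON and GHSOM build upon; the grid size and the learning-rate and neighbourhood schedules are illustrative choices.

```python
import numpy as np

def train_som(data, rows=4, cols=4, epochs=20, lr0=0.5, sigma0=2.0, seed=0):
    """Minimal self-organizing-map sketch (the building block that FGHSON
    extends); not the fuzzy hierarchical algorithm of the thesis."""
    rng = np.random.default_rng(seed)
    dim = data.shape[1]
    weights = rng.random((rows, cols, dim))
    # Grid coordinates, used to compute the neighbourhood function.
    grid = np.stack(np.meshgrid(np.arange(rows), np.arange(cols),
                                indexing="ij"), axis=-1)
    for epoch in range(epochs):
        lr = lr0 * (1 - epoch / epochs)            # decaying learning rate
        sigma = sigma0 * (1 - epoch / epochs) + 1e-3  # shrinking radius
        for x in rng.permutation(data):
            # Best-matching unit: node whose weight vector is closest to x.
            dists = np.linalg.norm(weights - x, axis=-1)
            bmu = np.unravel_index(np.argmin(dists), dists.shape)
            # Gaussian neighbourhood pulls nearby nodes toward x.
            d2 = np.sum((grid - np.array(bmu)) ** 2, axis=-1)
            h = np.exp(-d2 / (2 * sigma ** 2))
            weights += lr * h[..., None] * (x - weights)
    return weights

if __name__ == "__main__":
    pts = np.random.default_rng(1).random((200, 3))
    print(train_som(pts).shape)  # (4, 4, 3): a 4x4 map of 3-D prototypes
```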

2010

ALLANI Mouna - Tree-based message diffusion for managing replicated data in unreliable and resource-constrained peer-to-peer environment

This thesis proposes a set of adaptive broadcast solutions and an adaptive data replication solution to support the deployment of P2P applications. P2P applications are an emerging type of distributed application that runs on top of P2P networks; typical examples are video streaming and file sharing.
While interesting because they are fully distributed, P2P applications suffer from several deployment problems due to the nature of the environment in which they run. Indeed, defining an application on top of a P2P network often means defining an application in which peers contribute resources in exchange for their ability to use the application. For example, in a P2P file-sharing application, while a user is downloading a file, the P2P application is in parallel serving that file to other users. Such peers may have limited hardware resources, e.g. CPU, bandwidth and memory, or the end-user may decide a priori to limit the resources dedicated to the P2P application. In addition, a P2P network is typically immersed in an unreliable environment, where communication links and processes are subject to message losses and crashes, respectively.
To support P2P applications, this thesis proposes a set of services that address some underlying constraints related to the nature of P2P networks. The proposed services include a set of adaptive broadcast solutions and an adaptive data replication solution that can be used as the basis of several P2P applications. Our data replication solution increases availability and reduces the communication overhead. The broadcast solutions aim at providing a communication substrate encapsulating one of the key communication paradigms used by P2P applications: broadcast. They typically aim at offering reliability and scalability to some upper layer, be it an end-to-end P2P application or another system-level layer, such as a data replication layer.
Our contributions are organized in a protocol stack made of three layers. In each layer, we propose a set of adaptive protocols that address specific constraints imposed by the environment. Each protocol is evaluated through a set of simulations. The adaptiveness aspect of our solutions relies on the fact that they take into account the constraints of the underlying system in a proactive manner.
To model these constraints, we define an environment approximation algorithm that provides an approximate view of the system or part of it. This approximate view includes the topology and the reliability of the components, expressed in probabilistic terms.
To adapt to the underlying system constraints, the proposed broadcast solutions route messages through tree overlays that maximize the broadcast reliability. Here, the broadcast reliability is expressed as a function of the reliability of the selected paths and of the use of available resources. These resources are modelled as quotas of messages reflecting the receiving and sending capacities of each node. To allow deployment in a large-scale system, we take the memory available at processes into account by limiting the view they have to maintain of the system. Using this partial view, we propose three scalable broadcast algorithms, based on a propagation overlay that tends towards the global tree overlay and adapts to the constraints of the underlying system.
At a higher level, this thesis also proposes a data replication solution that is adaptive both in terms of replica placement and in terms of request routing. At the routing level, this solution takes the unreliability of the environment into account in order to maximize the reliable delivery of requests. At the replica placement level, the dynamically changing origin and frequency of read/write requests are analysed in order to define a set of replicas that minimizes the communication cost.
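
The broadcast algorithms are only summarized above. One standard building block consistent with this description is a maximum-reliability tree: assuming independent link reliabilities, maximizing a product of probabilities is equivalent to minimizing a sum of -log terms, so a Dijkstra-style search applies. A sketch, ignoring the quota and partial-view mechanisms of the thesis:

```python
import heapq
import math

def max_reliability_tree(links, root):
    """Build a tree maximizing the probability that a broadcast from
    `root` reaches each node, assuming independent link reliabilities.
    `links` maps node -> {neighbour: reliability in (0, 1]}."""
    cost = {root: 0.0}           # -log of best path reliability so far
    parent = {root: None}
    heap = [(0.0, root)]
    while heap:
        c, u = heapq.heappop(heap)
        if c > cost.get(u, math.inf):
            continue             # stale heap entry
        for v, p in links[u].items():
            nc = c - math.log(p)
            if nc < cost.get(v, math.inf):
                cost[v] = nc
                parent[v] = u
                heapq.heappush(heap, (nc, v))
    return parent  # tree edges: v -> parent[v]

links = {"a": {"b": 0.9, "c": 0.5}, "b": {"a": 0.9, "c": 0.8},
         "c": {"a": 0.5, "b": 0.8}}
# Going a->b->c (0.72) beats the direct a->c link (0.5):
print(max_reliability_tree(links, "a"))  # {'a': None, 'b': 'a', 'c': 'b'}
```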

BACH Christian Woldemar - Interactive epistemology and reasoning: on the foundations of game theory

Game theory describes and analyzes strategic interaction. A distinction is usually drawn between static games, strategic situations in which the players choose only once and simultaneously, and dynamic games, strategic situations involving sequential choices. In addition, dynamic games can be further classified according to perfect and imperfect information: a dynamic game is said to exhibit perfect information whenever at any point of the game every player has full informational access to all choices that have been made so far, whereas in the case of imperfect information some players are not fully informed about some choices.
Game-theoretic analysis proceeds in two steps. Firstly, games are modelled by so-called form structures which extract and formalize the significant parts of the underlying strategic interaction. The basic and most commonly used models of games are the normal form, which rather sparsely describes a game merely in terms of the players' strategy sets and utilities, and the extensive form, which models a game in a more detailed way as a tree. In fact, it is standard to formalize static games with the normal form and dynamic games with the extensive form. Secondly, solution concepts are developed to solve models of games, in the sense of identifying the choices that should be taken by rational players. Indeed, the ultimate objective of the classical approach to game theory, which is of normative character, is the development of a solution concept capable of identifying a unique choice for every player in an arbitrary game. However, given the large variety of games, it is not at all certain whether it is possible to devise a solution concept with such universal capability.
Alternatively, interactive epistemology provides an epistemic approach to game theory of descriptive character. This rather recent discipline analyzes the relation between knowledge, belief and choice of game-playing agents in an epistemic framework. The description of the players' choices in a given game relative to various epistemic assumptions constitutes the fundamental problem addressed by an epistemic approach to game theory. In a general sense, the objective of interactive epistemology consists in characterizing existing game-theoretic solution concepts in terms of epistemic assumptions, as well as in proposing novel solution concepts by studying the game-theoretic implications of refined or new epistemic hypotheses. Intuitively, an epistemic model of a game can be interpreted as representing the reasoning of the players. Indeed, before making a decision in a game, the players reason about the game and their respective opponents, given their knowledge and beliefs. Precisely these epistemic mental states, on which players base their decisions, are explicitly expressible in an epistemic framework.
In this PhD thesis, we consider an epistemic approach to game theory from a foundational point of view. In Chapter 1, basic game-theoretic notions as well as Aumann's epistemic framework for games are expounded and illustrated; Aumann's sufficient conditions for backward induction are also presented and his conceptual views discussed. In Chapter 2, Aumann's interactive epistemology is conceptually analyzed. In Chapter 3, which is based on joint work with Conrad Heilmann, a three-stage account for dynamic games is introduced and a type-based epistemic model is extended with a notion of agent connectedness; sufficient conditions for backward induction are then derived. In Chapter 4, which is based on joint work with Jérémie Cabessa, a topological approach to interactive epistemology is initiated; in particular, the epistemic-topological operator limit knowledge is defined and some of its implications for games are considered. In Chapter 5, which is based on joint work with Jérémie Cabessa and Andrés Perea, Aumann's impossibility theorem on agreeing to disagree is revisited and weakened, in the sense that possible contexts are provided in which agents can indeed agree to disagree.
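
While Aumann's epistemic conditions are beyond a code sketch, the backward-induction procedure they are meant to justify is mechanical. A minimal sketch on perfect-information game trees, with an illustrative tree encoding:

```python
def backward_induction(node):
    """Minimal backward-induction sketch for a perfect-information game
    tree. Leaves are payoff tuples; internal nodes are lists
    [player, children]; each player maximizes its own payoff."""
    if isinstance(node, tuple):              # leaf: payoff profile
        return node, []
    player, children = node
    best = None
    for i, child in enumerate(children):
        payoffs, path = backward_induction(child)
        if best is None or payoffs[player] > best[0][player]:
            best = (payoffs, [i] + path)
    return best

# Two-stage game: player 0 chooses a subtree, then player 1 moves.
game = [0, [[1, [(2, 1), (0, 0)]],
            [1, [(3, 0), (1, 2)]]]]
# Anticipating player 1's replies, player 0 goes left: ((2, 1), [0, 0]).
print(backward_induction(game))
```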

Thesis in joint-supervision with the University of Maastricht

 

REMILI Ahmed - Contribution à l’élaboration de mécanismes de détection, de contrôle et de lutte contre le blanchiment d’argent au regard de l’usage des technologies de l’information et de la communication « cas de l’Algérie »

The fight against money laundering has become a priority for states and governments, with the aim, on the one hand, of preserving the economy and the integrity of financial marketplaces and, on the other, of depriving criminal organizations of financial resources. In this context, the major concern of the Algerian authorities in charge of fighting this phenomenon is to put in place a system capable of detecting laundering mechanisms, assessing the threat they represent and, on the basis of this knowledge, defining and deploying the most effective and efficient means of response. We observe, however, that conducting laundering investigations only as a consequence of an underlying crime has shown its limits in terms of establishing evidence, solving cases and recovering assets. We therefore argue that it would be more judicious to set up, upstream, a "systematic" control of financial flows and of unusual and/or suspicious operations and, from there, to identify possible laundering operations without necessarily knowing the initial crime, while maintaining the balance between an "all-security" stance oriented towards increased surveillance of flows and the preservation of privacy and individual liberties.

Our thesis takes a critical look at the current anti-money-laundering system in Algeria, which we evaluate and in which we identify several shortcomings. To address the problems identified, we propose strategic, organizational, methodological and technological solutions integrated into a coherent operational framework at the national and international levels.

TASHI Igli - An assurance-based model to holistically assess the information security posture

Evaluating the Information Security posture of an organization is becoming a very complex task. Currently, the evaluation and assessment of Information Security are commonly performed using frameworks, methodologies and standards which often consider the various aspects of security independently. This is ineffective because it does not take into consideration the necessity of a global, systemic and multidimensional approach to Information Security evaluation. At the same time, the overall security level is commonly considered to be only as strong as its weakest link.

This thesis proposes a model aiming to holistically assess all dimensions of security in order to minimize the likelihood that a given threat will exploit the weakest link. A formalized structure taking into account all security elements is presented; this is based on a methodological evaluation framework in which Information Security is evaluated from a global perspective.

BROUSSE Olivier - A bio-inspired agent-based programming environment for pervasive platforms

Ubiquitous Computing is the emerging trend in computing systems. Based on this observation this thesis proposes an analysis of the hardware and environmental constraints that rule pervasive platforms. These constraints have a strong impact on the programming of such platforms. Therefore solutions are proposed to facilitate this programming both at the platform and node levels.

The first contribution presented in this document proposes a combination of agent-oriented programming with the principles of bio-inspiration (Phylogenesys, Ontogenesys and Epigenesys) to program pervasive platforms such as the PERPLEXUS platform (PERvasive computing framework for modeling comPLEX virtually Unbounded Systems).

The second contribution proposes a method for efficiently programming parallelizable applications on each computing node of this platform.

Thesis in joint-supervision with the Université de Montpellier

DARABOS Christian - Towards robust network based complex systems. From evolutionary cellular automata to biological models

Complex systems science is an interdisciplinary field grouping under the same umbrella dynamical phenomena from the social, natural and mathematical sciences. The emergence of a higher-order organization or behavior, transcending what is expected of the linear addition of the parts, is a key factor shared by all these systems. Most complex systems can be modeled as networks that represent the interactions among the system's components. In addition to the actual nature of the parts' interactions, the intrinsic topological structure of the underlying network is believed to play a crucial role in the remarkable emergent behaviors exhibited by these systems. Moreover, the topology is also a key factor in explaining the extraordinary flexibility and resilience to perturbations observed in transmission and diffusion phenomena. In this work, we study the effect of different network structures on the performance and fault tolerance of systems in two different contexts.

Thesis in joint-supervision with the University of Turin

GASPOZ Cédric - Prediction markets supporting technology assessment

In this thesis, we study the use of prediction markets for technology assessment. We particularly focus on their ability to assess complex issues, the design constraints required for such applications and their efficacy compared to traditional techniques. To achieve this, we followed a design science research paradigm, iteratively developing, instantiating, evaluating and refining the design of our artifacts. This allowed us to make multiple contributions, both practical and theoretical.
We first showed that prediction markets are adequate for properly assessing complex issues. We also developed a typology of design factors and design propositions for using these markets in a technology assessment context. Then, we showed that they are able to solve some issues related to the R&D portfolio management process, and we proposed a roadmap for their implementation. Finally, by comparing the instantiation and the results of a multi-criteria decision method and a prediction market, we showed that the latter is more efficient while offering similar results. We also proposed a framework for comparing forecasting methods, in order to identify their constraints based on contingency factors. In conclusion, our research opens a new field of application of prediction markets and should help hasten their adoption by enterprises.
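
The abstract does not detail the market mechanism used. A common way to run such prediction markets is Hanson's logarithmic market scoring rule, sketched below; the liquidity parameter b and the sample trade are illustrative.

```python
import math

def lmsr_cost(quantities, b=100.0):
    """Hanson's logarithmic market scoring rule: cost function over the
    outstanding shares per outcome; b controls market liquidity."""
    return b * math.log(sum(math.exp(q / b) for q in quantities))

def lmsr_price(quantities, i, b=100.0):
    """Instantaneous price of outcome i (its implied probability)."""
    exps = [math.exp(q / b) for q in quantities]
    return exps[i] / sum(exps)

# A trader buys 10 shares of outcome 0 in a two-outcome market;
# the payment is the difference in the cost function.
q = [0.0, 0.0]
pay = lmsr_cost([q[0] + 10, q[1]]) - lmsr_cost(q)
q[0] += 10
print(round(pay, 4), round(lmsr_price(q, 0), 4))  # price now above 0.5
```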

MELIANE Rym - Understanding consumers repurchase in the context of online shopping: an empirical study

The use of the Internet as a shopping and purchasing medium has seen exceptional growth. However, 99% of new online businesses fail. Most online buyers do not come back for a repurchase, and 60% abandon their shopping cart before checkout. After the first purchase, online consumer retention thus becomes critical to the success of the e-commerce vendor. Retaining existing customers can save costs, increase profits, and is a means of gaining competitive advantage.

Past research identified loyalty as the most important factor in achieving customer retention, and commitment as one of the most important factors in relationship marketing, providing a good description of what type of thinking leads to loyalty. Yet we could not find an e-commerce study investigating the impact of both online loyalty and online commitment on online repurchase. One of the advantages of online shopping is the ability to browse for the best price with one click; yet we could not find an empirical e-commerce study investigating the impact of post-purchase price perception on online repurchase. The objective of this research is to develop a theoretical model aimed at understanding online repurchase, or purchase continuance from the same online store.

Our model was tested in a real e-commerce context with an overall sample of 1,866 real online buyers from the same online store. The study focuses on repurchase; therefore, the randomly selected respondents had purchased from the online store at least once prior to the survey. Five months later, we tracked respondents to see if they had actually come back for a repurchase.

Our findings show that online Intention to repurchase has a non-significant impact on online Repurchase. Online post-purchase Price perception and online Normative Commitment have a non-significant impact on online Intention to repurchase, whereas online Affective Commitment, online Attitudinal Loyalty, online Behavioral Loyalty, and online Calculative Commitment have a positive impact on online Intention to repurchase. Furthermore, online Attitudinal Loyalty partially mediates between online Affective Commitment and online Intention to repurchase, and online Behavioral Loyalty partially mediates between online Attitudinal Loyalty and online Intention to repurchase.

We conducted two follow-up analyses: 1) on a sample of first-time buyers, we found that online post-purchase Price perception has a positive impact on Intention; 2) we divided the main study's sample into Swiss-French and Swiss-German repeat buyers, and the results show that the Swiss-French display more emotion when shopping online than the Swiss-Germans. Our findings contribute to academic research but also to practice.

SATIZABAL MEJIA Hector Fabrio - Using biological inspiration to perform incremental modelling tasks
PESTELACCI Enea - Emergence of cooperation on static and dynamic network

Game theory is a branch of applied mathematics used to analyze situations where two or more agents are interacting. Originally it was developed as a model for conflicts and collaborations between rational and intelligent individuals. Now it finds applications in the social sciences, economics, biology (particularly evolutionary biology and ecology), engineering, political science, international relations, computer science, and philosophy. Networks are an abstract representation of interactions, dependencies or relationships. Networks are extensively used in all the fields mentioned above and in many more. Much useful information about a system can be discovered by analyzing the current state of a network representation of that system.
In this work we apply some of the methods of game theory to populations of interconnected agents. A population is represented by a network of players in which one player can only interact with another if there is a connection between them. In the first part of this work we show that the structure of the underlying network has a strong influence on the strategies that the players adopt to maximize their utility. We then introduce a supplementary degree of freedom by allowing the structure of the population to be modified over the course of the simulations; this modification allows the players to reshape their environment to optimize the utility they can obtain.
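
As a rough illustration of the first part's setup, the sketch below runs a Prisoner's Dilemma on a fixed ring with imitate-the-best updating; the payoff values, topology and update rule are standard textbook choices, not necessarily those used in the thesis.

```python
import random

# Prisoner's dilemma payoffs: PAYOFF[my_move][their_move], C=0, D=1.
PAYOFF = [[3, 0], [5, 1]]

def step(neighbours, strategy, rng):
    """One round of play plus imitate-the-best updating on a fixed graph:
    each player copies the highest-scoring player in its neighbourhood."""
    score = {v: sum(PAYOFF[strategy[v]][strategy[w]] for w in neighbours[v])
             for v in neighbours}
    new = {}
    for v in neighbours:
        pool = [v] + list(neighbours[v])
        # Random tie-breaking among equally scoring neighbours.
        new[v] = strategy[max(pool, key=lambda w: (score[w], rng.random()))]
    return new

rng = random.Random(0)
# Ring of 10 players, alternating cooperators and defectors.
nbrs = {i: [(i - 1) % 10, (i + 1) % 10] for i in range(10)}
strat = {i: i % 2 for i in range(10)}
for _ in range(20):
    strat = step(nbrs, strat, rng)
print(strat)  # the final profile depends strongly on the topology
```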

FACCHINI Alessandro - A study on the expressive power of some fragments of the modal mu-calculus

The μ-calculus is an extension of modal logic with fixpoint operators. In this work we study the complexity of certain fragments of this logic from two different but closely related points of view: one syntactic (or combinatorial), the other topological. From the syntactic point of view, the properties definable in this formalism are classified according to the combinatorial complexity of the formulas of the logic, that is, according to the number of alternations of fixpoint operators; comparing two sets of models thus amounts to comparing the syntactic complexity of the associated formulas. From the topological point of view, the properties definable in this logic are compared by means of continuous reductions, or according to their position in the Borel or the projective hierarchy.
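
For readers unfamiliar with the formalism, a few standard μ-calculus formulas illustrate the fixpoint operators and the alternation being counted (textbook examples, not drawn from the thesis):

```latex
% Reachability: p holds at some reachable state (least fixpoint).
\mu X.\,(p \lor \Diamond X)
% Safety: p holds at every reachable state (greatest fixpoint).
\nu X.\,(p \land \Box X)
% "p holds infinitely often along some path" requires a genuine
% nu/mu alternation, one level higher in the hierarchy.
\nu X.\,\mu Y.\,\Diamond\bigl((p \land X) \lor Y\bigr)
```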

Thesis in joint-supervision with the Université Bordeaux-I, France

HENDAOUI Adel - Consumer shopping behavior in 3D virtual worlds: a theoretical and empirical investigation

2009

MINIAOUI Sami - Indexation et annotation pour améliorer le partage des documents

The sharing and reuse of learning objects is still a utopia. The pooling of educational documents and their adaptation to different contexts have been the subject of a great deal of work. One problematic aspect concerns their description, which must be as precise as possible in order to facilitate their management and, more specifically, targeted access. This description is generally performed by instantiating a set of standardized descriptors, or metadata (LOM, ARIADNE, DC, etc.). It must be acknowledged that despite the existence of these standards, some of which are relatively unconstraining, few educators or authors undertake this exercise, which remains burdensome and unrewarding.
We started from the idea that if indexing could be performed automatically with a good degree of accuracy, part of the solution would be found. To this end, we first analysed the blocking factors of the manual generation performed by the instructional engineers of the Université de Lausanne. The complexity of these factors (human and technical) reinforced our view that the automatic generation of metadata is indeed capable of circumventing the difficulties identified. We therefore developed an application for the automatic generation of metadata which focuses on content as its sole source of extraction. An in-depth analysis of the results shows that: (i) for unstructured documents, our application yields satisfactory results with respect to the metadata quality indicators (completeness, precision, logical consistency and coherence); (ii) for structured documents, automatic generation proved less satisfactory insofar as it does not exploit the semantic elements (structure, annotations) they contain.
Believing we could do better in the structured case, we pursued our work and proposed a second application that takes advantage of the potential of structured documents and of the related transformation languages (XSLT) to improve search within these documents. This application exploits all the semantic elements (structure, annotations) and constitutes an alternative to metadata-based search. Moreover, search based on annotations and structure offers the additional advantage of retrieving not only the documents themselves but also parts of documents, a considerable improvement over metadata-based search, which only gives access to whole documents. In conclusion we show, through appropriate examples, that depending on the type of document it is possible either to index unstructured documents automatically in order to facilitate their retrieval, or to directly exploit the semantic content of structured documents.

FANG Gang - Firm's network capability and innovation performance: evidences from China hi-tech industry

A firm's competitive advantage can arise from internal resources as well as from an interfirm network. This dissertation investigates the competitive advantage of a firm involved in an innovation network by integrating strategic management theory and social network theory. It develops theory and provides empirical evidence illustrating how a networked firm enables network value and appropriates this value in an optimal way according to its strategic purpose. The four inter-related essays in this dissertation provide a framework that sheds light on the extraction of value from an innovation network by managing and designing the network in a proactive manner.

SONNA MOMO Lambert - Elaboration de tableaux de bord SSI dynamiques: une approche à base d'ontologies
HOLZER Adrian - Modular support for developing mobile ad hoc applications

This PhD thesis addresses the issue of alleviating the burden of developing ad hoc applications. Such applications have the particularity of running on mobile devices, communicating in a peer-to-peer manner and implementing some proximity-based semantics. A typical example is a radar application in which users see their avatar, as well as the avatars of their friends, on a map on their mobile phone. Such applications have become increasingly popular with the advent of the latest generation of mobile smartphones, with their impressive computational power, their peer-to-peer communication capabilities and their location detection technology. Unfortunately, the existing programming support for such applications is limited, hence the need to address this issue in order to alleviate their development burden.

This thesis specifically tackles this problem by providing several tools for application development support. First, it provides the location-based publish/subscribe service (LPSS), a communication abstraction which elegantly captures recurrent communication issues and thus dramatically reduces code complexity. LPSS is implemented in a modular manner in order to target two different network architectures. One pragmatic implementation is aimed at mainstream infrastructure-based mobile networks, where mobile devices communicate through fixed antennas. The other, fully decentralized, implementation targets emerging mobile ad hoc networks (MANETs), where no fixed infrastructure is available and communication can only occur in a peer-to-peer fashion. For each of these architectures, various implementation strategies tailored to different application scenarios are provided, and they can be parametrized at deployment time. Second, this thesis provides two location-based message diffusion protocols, namely 6Shot broadcast and 6Shot multicast, specifically aimed at MANETs and fine-tuned to be used as building blocks for LPSS. Finally, this thesis proposes Phomo, a phone motion testing tool that makes it possible to test the proximity semantics of ad hoc applications without having to move around with mobile devices. These development support tools have been packaged in a coherent middleware framework called Pervaho.
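
The LPSS interface is not given in this abstract. The toy sketch below conveys the idea of range-limited delivery in a centralized setting; all names are hypothetical and do not reflect the Pervaho API.

```python
import math
from dataclasses import dataclass, field

@dataclass
class LocationPubSub:
    """Toy centralized sketch of a location-based publish/subscribe
    service: a publication is delivered to a subscriber only if it was
    published within the subscriber's range (names are illustrative,
    not the Pervaho/LPSS API)."""
    subs: list = field(default_factory=list)

    def subscribe(self, topic, x, y, radius, callback):
        self.subs.append((topic, x, y, radius, callback))

    def publish(self, topic, x, y, payload):
        for t, sx, sy, r, cb in self.subs:
            if t == topic and math.hypot(x - sx, y - sy) <= r:
                cb(payload)

ps = LocationPubSub()
ps.subscribe("radar", 0.0, 0.0, 5.0, lambda m: print("nearby:", m))
ps.publish("radar", 3.0, 4.0, "friend at (3,4)")  # distance 5 -> delivered
ps.publish("radar", 6.0, 8.0, "friend at (6,8)")  # distance 10 -> filtered
```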

ROCHAT Denis - Réseaux complexes non homogènes appliqués aux réseaux mobiles ad'hoc
MOTTIER-LESZNER Ewa - Evaluation multidimensionnelle de l'impact des systèmes d'information sur l'organisation des PME

2008

MARIETHOZ JUSSUPOVA Yelena - Design of the corporate memory based on the intellectual capital monitoring
MADHOUR Hend - Modèle et application pour la génération automatique de parcours d'apprentissage personnalisés à partir de sources de confiance

When searching for information, learners are very often confronted with problems of guidance and personalization. These problems are all the more important when the search takes place in an open environment such as the Web: in that case there is currently no relevance control over the resources proposed, nor over their actual fit with the learner's specific needs.
Through a review of the state of the art, we noted the absence of a reference model addressing the problems related (i) on the one hand to learning resources, notably the heterogeneity of their structure and description and their protection in terms of copyright, and (ii) on the other hand to the learner as a user, notably the acquisition of the elements characterizing him or her and the adaptation strategy to offer.
Our objective is to propose an adaptive system based on learning resources drawn from an environment with controlled openness. It automatically generates, without the intervention of an instructional expert, a personalized learning path from resources made available through trusted sources.
The originality of our work lies in the proposal of a reference model, the so-called Lausanne model, based on what we consider to be the best practices of three communities: (i) the Web, in terms of means of openness; (ii) adaptive hypermedia, in terms of adaptation strategy; and (iii) distance learning, in terms of the handling of learning resources.
In our model, personalized paths are generated on the basis of:
(i) indexed learning resources whose degree of granularity favours sharing and reuse, with the trusted sources used guaranteeing their usefulness and quality;
(ii) user characteristics, compatible with existing standards, allowing the learner to move from one environment to another;
(iii) adaptation that is both individual and social.
To this end, the Lausanne model proposes:
(i) using ISO/MLR (Metadata for Learning Resources) as the description formalism;
(ii) describing the user model with XUM (eXtended User Model), our proposal for a model compatible with the IEEE/PAPI and IMS/LIP standards;
(iii) adapting the ant-colony algorithm to the distance-learning context in order to generate personalized paths, with the individual dimension also taken into account through the mapping between MLR and XUM (see the sketch below).
To validate our model, we developed an application and tested several scenarios involving different users at different times. We then compared what the system returns with what the expert suggests. The results proved satisfactory insofar as the system each time returned a path similar to the one the expert would have proposed, which supports our approach.
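
As a rough illustration of point (iii) above, the sketch below applies a basic ant-colony scheme to path construction over a small resource graph; the graph, the parameters and the shortest-path objective are illustrative stand-ins for the thesis's pedagogical criteria.

```python
import random

def ant_colony_path(graph, start, goal, ants=50, rounds=30,
                    evaporation=0.3, deposit=1.0, seed=0):
    """Minimal ant-colony sketch: ants build paths from `start` to
    `goal`; shorter paths receive more pheromone and become more likely.
    A stand-in for the learning-path idea, not the thesis's algorithm."""
    rng = random.Random(seed)
    pher = {(u, v): 1.0 for u in graph for v in graph[u]}
    best = None
    for _ in range(rounds):
        paths = []
        for _ in range(ants):
            node, path, seen = start, [start], {start}
            while node != goal:
                options = [v for v in graph[node] if v not in seen]
                if not options:
                    break                       # dead end: abandon ant
                weights = [pher[(node, v)] for v in options]
                node = rng.choices(options, weights)[0]
                path.append(node)
                seen.add(node)
            if node == goal:
                paths.append(path)
                if best is None or len(path) < len(best):
                    best = path
        for key in pher:                        # pheromone evaporation
            pher[key] *= (1 - evaporation)
        for path in paths:                      # reinforcement
            for u, v in zip(path, path[1:]):
                pher[(u, v)] += deposit / len(path)
    return best

graph = {"intro": ["basics", "survey"], "basics": ["exercise", "survey"],
         "survey": ["exercise"], "exercise": ["exam"], "exam": []}
print(ant_colony_path(graph, "intro", "exam"))
```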

2007

SFAXI Mohamed Ali - Improving telecommunication security level by integrating quantum key distribution in communication protocols

Classical cryptography is based on mathematical functions. The robustness of a cryptosystem essentially depends on the difficulty of computing the inverse of its one-way function, yet there is no mathematical proof establishing whether finding the inverse of a given one-way function is impossible. Therefore, it is mandatory, especially for banking, government and similar applications, to use a cryptosystem whose security is scientifically proven. The security of quantum cryptography, on the other hand, can be formally demonstrated: it rests on the laws of physics, which assure unconditional security. How can quantum cryptography be used and integrated into existing solutions?

This thesis proposes a method to integrate quantum cryptography into existing communication protocols such as PPP, IPSec and 802.11i. It sketches out some possible scenarios in order to prove their feasibility and to estimate their cost. Directives and checkpoints are given to help certify quantum cryptography solutions according to the Common Criteria.
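
These integrations treat the quantum layer as a key-agreement primitive. As background, the sketch below simulates the sifting step of BB84, the best-known QKD protocol, in a toy model with no eavesdropper and no channel noise.

```python
import random

def bb84_sift(n=32, seed=7):
    """Toy BB84 key-sifting simulation (no eavesdropper, no physics):
    Alice sends random bits in random bases; Bob measures in random
    bases; they keep only the positions where the bases matched."""
    rng = random.Random(seed)
    alice_bits  = [rng.randrange(2) for _ in range(n)]
    alice_bases = [rng.randrange(2) for _ in range(n)]   # 0 = +, 1 = x
    bob_bases   = [rng.randrange(2) for _ in range(n)]
    # Matching basis -> Bob reads Alice's bit; otherwise a random result.
    bob_bits = [b if ab == bb else rng.randrange(2)
                for b, ab, bb in zip(alice_bits, alice_bases, bob_bases)]
    sifted = [b for b, ab, bb in zip(alice_bits, alice_bases, bob_bases)
              if ab == bb]
    bob_sifted = [b for b, ab, bb in zip(bob_bits, alice_bases, bob_bases)
                  if ab == bb]
    assert sifted == bob_sifted  # no channel noise in this toy model
    return sifted  # shared bits, before error estimation and privacy
                   # amplification in the full protocol

print(bb84_sift())  # roughly n/2 bits survive the sifting
```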

FOVEAU Charles-Emmanuel - Référentiels des compétences et des métiers: une approche ontologique

The globalization of markets, changes in the economic context and the impact of new information technologies have forced companies to rethink the way they manage their intellectual capital (knowledge management) and their human capital (competency management). It is commonly accepted today that these play a particularly strategic role in the organization. A company wishing to engage in a policy of managing these assets will face various problems. Indeed, in order to manage this knowledge and these competencies, a long capitalization process must be carried out, passing through different stages such as the identification, extraction and representation of knowledge and competencies.
For this purpose, various knowledge and competency management methods exist, such as MASK, CommonKADS and KOD. Unfortunately, these methods are very cumbersome to implement, are confined to certain types of knowledge and are consequently limited in the functionalities they can offer. Finally, competency management and knowledge management are treated as two separate fields, whereas it would be interesting to unify the two approaches into one. Indeed, competencies are very close to knowledge, as the following definition of competency underlines: "a set of knowledge in action in a given context".
Consequently, we chose to base our proposal on the concept of competency. Competency is among the most crucial kinds of organizational knowledge, in particular to avoid the loss of know-how or to anticipate the future needs of the company, for behind employees' competencies lies the effectiveness of the organization. Moreover, many other organizational concepts, such as jobs, missions, projects and training, can be described in terms of competencies. Unfortunately, there is no real consensus on the definition of competency, and the various existing definitions, even if fully satisfactory for experts, do not make it possible to build an operational system.
In our approach, we address competency management using a knowledge management method. Indeed, by their very nature, knowledge and competency are intimately linked, so such a method is perfectly suited to competency management.
In order to exploit this knowledge and these competencies, we first had to define the organizational concepts in a clear and computational way. On this basis, we propose a methodology for building the various corporate repositories (competency, mission and job repositories, etc.). To model these repositories we chose ontologies, because they yield coherent and consensual definitions of concepts while supporting linguistic diversity.
We then map the company's knowledge (training, missions, jobs, etc.) onto these ontologies so that it can be exploited and disseminated.
Our approach to knowledge and competency management led to a tool offering numerous functionalities, such as the management of mobility areas, strategic analysis, directories and CV management.

ONDRUS Jan - A design science approach to support the assessment of disruptive technology: the Swiss mobile payment case

In this thesis we present the design of a systematic, integrated, computer-based approach for detecting potential disruptions from an industry perspective. Following the design science paradigm, we iteratively develop several multi-actor, multi-criteria artifacts dedicated to environment scanning. The contributions of this thesis are both theoretical and practical. We demonstrate the successful use of multi-criteria decision-making methods for technology foresight. Furthermore, we illustrate the design of our artifacts using build-and-evaluate loops supported by a field study of the Swiss mobile payment industry. To increase the relevance of this study, we systematically interviewed key Swiss experts for each design iteration. As a result, our research provides a realistic picture of the current situation in the Swiss mobile payment market and reveals previously undiscovered weak signals of future trends. Finally, we suggest a generic design process for environment scanning.
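
The abstract does not name the specific multi-criteria methods used. To fix ideas, the simplest member of that family, simple additive weighting, is sketched below; the alternatives, criteria and weights are invented for illustration.

```python
def weighted_sum_ranking(scores, weights):
    """Toy multi-criteria ranking sketch (simple additive weighting; the
    thesis's multi-actor methods are richer). `scores` maps each
    alternative to its criterion scores, already on a common 0-1 scale."""
    total = {alt: sum(s * w for s, w in zip(vals, weights))
             for alt, vals in scores.items()}
    return sorted(total.items(), key=lambda kv: kv[1], reverse=True)

# Three hypothetical mobile-payment technologies rated on cost,
# maturity and reach, with weights summing to 1.
scores = {"NFC": [0.6, 0.8, 0.5], "SMS": [0.9, 0.9, 0.9],
          "WAP": [0.7, 0.4, 0.6]}
print(weighted_sum_ranking(scores, [0.5, 0.3, 0.2]))
```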

FERNANDES Emmnanuel - Ingénieur pédagogique et démarche projet : facteurs clés de succès pour l'intégration des technologies dans la pratique enseignante ?

Western governments have spent considerable sums to facilitate the integration of information and communication technologies into teaching, hoping to find an economical solution to the thorny equation that could be summed up by the famous formula "do more and better with less". Yet despite these efforts and the very clear improvement in the quality of service of the infrastructures, this objective is far from being reached. While we believe it is illusory to expect technology, by itself, to solve the problems of teaching quality, we nevertheless believe that it can help improve learning conditions and contribute to the pedagogical reflection that every teacher should conduct before delivering a course. With this in mind, and convinced that distance education offers significant advantages provided one thinks about teaching "differently", we became interested in the development of this type of application, which lies at the border between didactics, cognitive science and computer science.
Thus, in order to propose a realistic and simple solution facilitating the development, updating, integration and sustainability of distance-learning applications, we got involved in concrete projects. Through our field experience we observed that (i) the quality of flexible and distance learning modules is still very disappointing, among other reasons because the added value that the use of technologies can bring is, in our opinion, not sufficiently exploited, and that (ii) to succeed, any project must, besides providing a useful answer to a real need, be conducted efficiently with the support of a "champion".
With the idea of proposing a project management approach adapted to the needs of flexible and distance learning, we first examined the characteristics of this type of project. We then analysed existing project methodologies in the hope of using one of them, or an appropriate combination of those closest to our needs. We then, empirically and through successive iterations, defined a pragmatic project management approach and contributed to the elaboration of decision-support sheets facilitating its implementation. We describe some of its actors, insisting particularly on the instructional engineer, whom we consider one of the key success factors of our approach and whose vocation is to orchestrate it. Finally, we validated our approach a posteriori by revisiting the course of four flexible and distance learning projects in which we participated and which are representative of the projects encountered in a university setting.
In conclusion, we believe that the implementation of our approach, together with computerized decision-support sheets, constitutes an important asset and should in particular make it easier to measure the real impact of technologies (i) on the evolution of teaching practice, (ii) on the organization and (iii) on teaching quality. Our approach can also serve as a springboard for setting up a quality assurance process specific to flexible and distance learning. Further research on the real flexibilization of learning and on the benefits of technologies for learners can then be conducted on the basis of metrics that remain to be defined.

RICKEBUSCH Ian - On solving fair exchange and related distributed problems in byzantine environments

The solvability of the problem of fair exchange in a synchronous system subject to Byzantine failures is investigated in this work. The fair exchange problem arises when a group of processes are required to exchange digital items in a fair manner, meaning that either each process obtains the item it was expecting or no process obtains any information on the inputs of the others.
After introducing a novel specification of fair exchange that clearly separates safety and liveness, we give an overview of the difficulty of solving such a problem in the context of a fully-connected topology. On one hand, we show that no solution to fair exchange exists in the absence of an identified process that every process can trust a priori; on the other, a well-known solution to fair exchange relying on a trusted third party is recalled. These two results lead us to complete our system model with a flexible representation of the notion of trust. We then show that fair exchange is solvable if and only if a connectivity condition, named the reachable majority condition, is satisfied. The necessity of the condition is proven by an impossibility result and its sufficiency by presenting a general solution to fair exchange relying on a set of trusted processes.
The focus is then turned towards a specific network topology in order to provide a fully decentralized, yet realistic, solution to fair exchange. The general solution mentioned above is optimized by reducing the computational load assumed by trusted processes as far as possible. Accordingly, our fair exchange protocol relies on trusted tamperproof modules that have limited communication abilities and are only required in key steps of the algorithm. This modular solution is then implemented in the context of a pedagogical application developed for illustrating and apprehending the complexity of fair exchange. This application, which also includes the implementation of a wide range of Byzantine behaviors, allows executions of the algorithm to be set up and monitored through a graphical display.
Surprisingly, some of our results on fair exchange seem to contradict those found in the literature on secure multiparty computation, a problem from the field of modern cryptography, although the two problems have much in common. Both problems are closely related to the notion of a trusted third party, but their approaches and descriptions differ greatly. By introducing a common specification framework, a comparison is proposed in order to clarify their differences and the possible origins of the confusion between them. This leads us to introduce the problem of generalized fair computation, a generalization of fair exchange. Finally, a solution to this new problem is given by generalizing our modular solution to fair exchange.

ROSSI Mathias - Identification et qualification des compétences d'entreprise
CABESSA Jérémie - A game theoretical approach to the algebraic counterpart of the Wagner hierarchy

The Wagner hierarchy is known so far to be the most refined topological classification of ω-rational languages. Also, the algebraic study of formal languages shows that these ω-rational sets correspond precisely to the languages recognizable by finite pointed ω-semigroups. Within this framework, we provide a construction of the algebraic counterpart of the Wagner hierarchy. We adopt a hierarchical game approach, translating Wadge theory from the ω-rational-language context to the ω-semigroup context. More precisely, we first show that the Wagner degree is indeed a syntactic invariant. We then define a reduction relation on finite pointed ω-semigroups by means of a Wadge-like infinite two-player game. The collection of these algebraic structures ordered by this reduction is then proven to be isomorphic to the Wagner hierarchy, namely a well-founded and decidable partial ordering of width 2 and height ω^ω. We also describe a decidability procedure for this hierarchy: we introduce a graph representation of finite pointed ω-semigroups that allows their precise Wagner degrees to be computed. The Wagner degree of every ω-rational language can therefore be computed directly on its syntactic image. We then show how to build a finite pointed ω-semigroup of any given Wagner degree, and we finally describe the algebraic invariants characterizing every Wagner degree of this hierarchy.

Thesis in joint-supervision with the Université Paris-Diderot (Paris 7)

2006

CAMPONOVO Giovanni - Conceptual models for designing information systems supporting the strategic analysis of technology environments

Chapter 1 presents the motivations of this dissertation by illustrating two gaps in the current body of knowledge that are worth filling, describes the research problem addressed by this thesis and presents the research methodology used to achieve this goal.
Chapter 2 reviews the existing literature, showing that environment analysis is a vital strategic task, that it should be supported by adapted information systems, and that there is thus a need for a conceptual model of the environment providing a reference framework to better integrate the various existing methods, together with a more formal definition of the various aspects to support the development of suitable tools.
Chapter 3 proposes a conceptual model that specifies the various environmental aspects relevant to strategic decision making and how they relate to each other, and defines them in a more formal way better suited to information systems development.
Chapter 4 is dedicated to the evaluation of the proposed model on the basis of its application to a concrete environment, in order to evaluate its suitability for describing the current conditions and potential evolution of a real environment and to get an idea of its usefulness.
Chapter 5 goes a step further by assembling a toolbox of methods that can be used to analyze the various environmental aspects put forward by the model, and by providing more detailed specifications for a number of them to show how our model can facilitate their implementation as software tools.
Chapter 6 describes a prototype of a strategic decision support tool that allows the analysis of some aspects of the environment not well supported by existing tools, notably the relationships between multiple actors and issues. The usefulness of this prototype is evaluated on the basis of its application to a concrete environment.
Chapter 7 concludes this thesis with a summary of its various contributions and proposes further interesting research directions.
