Papers by Thomas Mazzuchi
Optimizing Operations, 2021
The Department of Defense (DoD) operates a world-wide supply chain, which in 2017 contained nearly 5 million items collectively valued at over $90 billion. Since at least 1990, designing and operating this supply chain, and adapting it to ever-changing military requirements, have been highly complex and tightly coupled problems, which the highest levels of DoD recognize as weaknesses. Military supply chains face a wide range of challenges. Decisions made at the operational and tactical levels of logistics can alter the effectiveness of decisions made at the strategic level. Decisions must be made with incomplete information. As a result, practical solutions must simultaneously incorporate decisions made at all levels as well as take into account the uncertainty faced by the logistician. The design of modern military supply chains, particularly for large networks where many values are not known precisely, is recognized as too complex for many techniques found in the academic literature.
Social Science Research Network, 2022
INTED2012 Proceedings, 2012
UMI Dissertation Information Service eBooks, 1982
Natural Hazards Review, Nov 1, 2023
INTED2013 Proceedings, 2013
The literature offers little insight into the effectiveness of Software Development Methodologies (SDM) at different life-cycle phases of development. Plan-based methodologies, such as Waterfall, Spiral, and Iterative, may offer discipline and standards in development practices. However, plan-based methodologies assume that a full listing of requirements is available prior to development and invite limited interaction with the customers prior to final product delivery. The Agile process, on the other hand, embraces the "divide and conquer" technique, leading to faster product development turnaround and more client involvement. This research introduces a model that extracts and empirically tests three main software development driver constructs: Facilitating Conditions, Affect, and Perceived Consequences. These constructs are derived from a well-known human-behavior model. Our model, although it can be used for evaluating the effectiveness of different software methodologies, will be used to measure the effectiveness of Agile methodologies at the requirements, design, and implementation stages. Hypothesis testing for our model favored the use of the Agile process most at the implementation stage and least at the design stage. Thus, our research introduces a new Agile methodology to enhance the effect of Agile methodologies during the design life-cycle stage. We use one of the Agile methodologies, SCRUM, as a base for our methodology and then inject Design Pattern Recognition techniques and RAD's time-boxed concepts to form a methodology that combines the best of the above-mentioned concepts.
Systems Engineering, Jul 1, 2022
Assessing the impact of natural disasters and manmade incidents against critical infrastructure systems is important and challenging. As critical infrastructure sectors become more interdependent, a method is needed to assess how disruptions to one sector affect other sectors. This need is particularly significant in the case of critical national supply chains which have not received the same attention as other critical infrastructure systems. This research develops a methodology for performing supply chain impact assessments by integrating model‐based systems engineering (MBSE) with discrete event simulation (DES). SysML models capture the supply chain structure and critical infrastructure interdependencies. Simulation of the supply chain model subjected to external constraints identifies the impacts from critical infrastructure disruptions. This approach enables researchers and public policy planners to assess critical supply chain risks associated with current infrastructure implementations, and to conduct what‐if analyses on alternative solutions. Application of this methodology to assess the natural disaster impacts on an urban food supply system demonstrates the effectiveness of modeling and simulation for evaluating the impacts of critical infrastructure disruptions on a wide range of critical supply chains. Future research should apply this methodology to different national supply chains and investigate additional critical infrastructure interdependency linkage mechanisms.
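The abstract above pairs SysML structure models with discrete event simulation. As a rough illustration of the simulation side only, the sketch below uses the SimPy library to model a single supply node whose deliveries halt during an assumed infrastructure outage window; the node, outage times, delivery rates, and demand rates are invented for illustration and do not come from the paper.

```python
# A minimal sketch (not the authors' implementation) of a discrete event
# simulation of one supply node subject to a critical-infrastructure outage.
# All names and parameters (outage window, delivery size, demand rate) are assumed.
import simpy

OUTAGE_START, OUTAGE_END = 48, 96   # hours when transport/power is disrupted (assumed)
SIM_HOURS = 168                     # one simulated week

def deliveries(env, stock, log):
    """Trucks arrive every 6 h unless the disruption window is active."""
    while True:
        yield env.timeout(6)
        if not (OUTAGE_START <= env.now < OUTAGE_END):
            yield stock.put(40)     # units delivered per truck (assumed)
        log.append((env.now, stock.level))

def demand(env, stock, shortfalls):
    """Consumers draw stock every hour; unmet demand hours are recorded."""
    while True:
        yield env.timeout(1)
        need = 5                    # hourly demand (assumed)
        if stock.level >= need:
            yield stock.get(need)
        else:
            shortfalls.append(env.now)

env = simpy.Environment()
stock = simpy.Container(env, init=200, capacity=1000)
log, shortfalls = [], []
env.process(deliveries(env, stock, log))
env.process(demand(env, stock, shortfalls))
env.run(until=SIM_HOURS)
print(f"hours with unmet demand: {len(shortfalls)}")
```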
Systems Engineering, Feb 20, 2021
Complex systems typically have a large parameter space and by definition have the potential for strong emergent behavior, especially when operating in a complex environment. A method is needed to design a system that will be resilient in a complex target operating environment. Here, we propose a maximin optimization approach to support the exploration of system design, based on co‐evolutionary system/environment models, which emphasizes the importance of the operating environment in the system evolution, resulting in increased system resilience. In this paper, this method is demonstrated on an air defense system and shows an increase in resilience when compared to the system resulting from the application of a genetic algorithm, as has been previously demonstrated for other systems. For the air defense system used here, the proposed maximin optimization process significantly (via a one‐sided paired t‐test) increased the resulting system resilience from 0.64 (the resilience of the system designed by a genetic algorithm) to 0.92 (the resilience of the system designed by maximin optimization).
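To make the maximin idea concrete, the following sketch selects the candidate design whose worst-case resilience across a set of sampled environments is largest. The resilience function, design encoding, and random sampling are placeholder assumptions; the paper's co-evolutionary system/environment models are not reproduced here.

```python
# Sketch of the maximin criterion only: pick the design whose *worst-case*
# resilience across candidate environments is highest. The resilience function
# and design/environment encodings are toys, not the paper's model.
import random

random.seed(1)

def resilience(design, env):
    """Toy resilience score in [0, 1]; a real model would simulate the system."""
    return max(0.0, min(1.0, 1.0 - abs(design - env) / 10.0))

designs = [random.uniform(0, 10) for _ in range(200)]       # candidate designs
environments = [random.uniform(0, 10) for _ in range(50)]   # stressing environments

def worst_case(design):
    return min(resilience(design, e) for e in environments)

best = max(designs, key=worst_case)   # maximin: maximize the minimum resilience
print(f"best design {best:.2f}, worst-case resilience {worst_case(best):.2f}")
```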
Systems Engineering, Nov 11, 2021
Naval leaders are increasingly demanding resilient maritime systems capable of effectively anticipating, responding to, recovering from, and adapting to a given environmental disruption. For engineers designing these systems, real‐world data, information, and knowledge must be leveraged to operationalize the concept of resilience, particularly during the verification process of the system life cycle. However, such data are commonly unstructured, poorly organized, and lacking in context. Resilience‐defining data may not even exist, necessitating a methodological approach to ensure that relevant data are actually collected during system operation. This paper presents a feasible conceptual data model to improve the process of characterizing a given system's resilient performance. The authors consider the prevailing literature from the systems engineering, resilience engineering, and data management communities, and a scenario involving a hurricane‐impacted ocean monitoring system is presented as a case study of the conceptual data model's utility. Ultimately, by using a resilience‐centric data modeling approach, systems engineers can better derive prescribed resilience measures and metrics for a given system or system‐of‐systems.
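As a purely hypothetical illustration of a resilience-centric data record, the sketch below groups a disruption event with time-stamped performance samples so that response and recovery can be reconstructed later; the class and field names are assumptions, not the conceptual data model proposed in the paper.

```python
# Illustrative only: one way to structure resilience-relevant observations so
# that disturbance, response, and recovery can be reconstructed after the fact.
# Field names are assumptions, not the paper's schema.
from dataclasses import dataclass, field
from datetime import datetime
from typing import List

@dataclass
class DisruptionEvent:
    event_id: str
    kind: str                      # e.g., "hurricane"
    onset: datetime
    cleared: datetime

@dataclass
class PerformanceSample:
    timestamp: datetime
    metric: str                    # e.g., "data_return_rate"
    value: float

@dataclass
class ResilienceRecord:
    system_id: str
    disruption: DisruptionEvent
    samples: List[PerformanceSample] = field(default_factory=list)

    def minimum_performance(self) -> float:
        """Lowest observed performance during and after the disruption."""
        return min(s.value for s in self.samples)
```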
Reliability Engineering & System Safety, Mar 1, 2021
Reliability Engineering & System Safety, Aug 1, 2018
Highlights: outlines limitations of "black-box" vulnerability discovery modeling (VDM) techniques; recaps 46 software release (SR) and security assessment profile (SAP) variables; gathers a multivariate expert judgment dataset using Cooke's method; introduces a "clear-box" scaled Bayesian model average (BMA) VDM technique; and demonstrates the scaled BMA VDM technique using several web browsers.
The International Journal of Business Ethics and Governance, Sep 30, 2021
Oil and gas projects are at high risk, for reasons such as the adoption of complex technology and the participation of many different parties. Oil and gas industries are often vulnerable to risks and hazards, but they overcome these problems by following risk management tools and techniques, which results in employee and organizational safety. Based on these facts, this research report aims to identify and evaluate the risk management strategies and procedures and to assess the efficiency of risk management tools in the Oil and Gas Company in the Kingdom of Bahrain. The survey was conducted among two groups using both quantitative and qualitative methods. One hundred twenty-four participants, comprising Engineers, Superintendents, Fire and Safety Officers, HR Managers, and Health and Safety Environment Officers, responded to the Survey Questionnaire. For the semi-structured interview, managers from supply and marine, an Operation Specialist, the Acting Manager of Health, Safety and Environment (HSE), and Managers of the Operational Plant Department were selected. The survey data were imported into IBM SPSS (Statistical Package for the Social Sciences) version 23.0 and analyzed descriptively to explain the participants' characteristics, with discrete variables expressed as frequencies and percentages and continuous variables expressed as means and standard deviations. The reliability of the instrument was assessed using Cronbach's alpha. This research indicated that around 56% of the engineers and a majority of the participating managers agree or strongly agree that the Oil and Gas Company in the Kingdom of Bahrain has implemented several safety precautions, training programs, and appropriate risk management tools to ensure the safety of the employees and to eliminate any risk that could be hazardous to life and property.
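The reliability check mentioned above, Cronbach's alpha, follows a standard formula; the sketch below computes it for a synthetic response matrix (the study's actual survey items and responses are not reproduced here).

```python
# Cronbach's alpha as typically computed, on synthetic Likert data standing in
# for the study's survey responses.
import numpy as np

rng = np.random.default_rng(0)
# rows = respondents, columns = Likert items (synthetic stand-in for the survey)
items = rng.integers(1, 6, size=(124, 8)).astype(float)

k = items.shape[1]
item_variances = items.var(axis=0, ddof=1)
total_variance = items.sum(axis=1).var(ddof=1)
alpha = (k / (k - 1)) * (1 - item_variances.sum() / total_variance)
print(f"Cronbach's alpha = {alpha:.2f}")
```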
Reliability Engineering & System Safety, Mar 1, 2019
Most software vulnerabilities are preventable, but they continue to be present in software releases. When Blackhats, or malicious researchers, discover vulnerabilities, they often release corresponding exploit software and malware. Therefore, customer confidence could be reduced if vulnerabilities, or discoveries of them, are not prevented, mitigated, or addressed. In addressing this, managers must choose which alternatives will provide maximal impact and could use vulnerability discovery modeling techniques to support their decision-making process. Applications of these techniques have used traditional approaches to analysis and, despite the dearth of data, have not included information from experts. This article takes an alternative approach, applying Bayesian methods to modeling the vulnerability-discovery phenomenon. Relevant data were obtained from security experts in structured workshops and from public databases. The open-source framework, MCMCBayes, was developed to automate performing Bayesian model averaging via power-posteriors. It combines predictions of interval-grouped discoveries by performance-weighting results from six variants of the non-homogeneous Poisson process (NHPP), two regression models, and two growth-curve models. The methodology is applicable to software makers and persons interested in applications of expert-judgment elicitation or in using Bayesian analysis techniques with phenomena having non-decreasing counts over time.
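The model-averaging step described above can be illustrated with a small sketch that weights the cumulative-discovery predictions of several candidate mean value functions. The specific functions, parameters, and weights below are assumptions for illustration, not the MCMCBayes posteriors or expert-derived values from the article.

```python
# Sketch of the model-averaging step only: combine each model's predicted
# cumulative discovery count using model weights. Mean value functions,
# parameters, and weights are illustrative, not the paper's fitted values.
import math

def nhpp_exponential(t, a=120.0, b=0.05):
    """Goel-Okumoto style mean value function (illustrative parameters)."""
    return a * (1.0 - math.exp(-b * t))

def nhpp_s_shaped(t, a=120.0, b=0.08):
    """Delayed S-shaped mean value function (illustrative parameters)."""
    return a * (1.0 - (1.0 + b * t) * math.exp(-b * t))

def logistic_growth(t, a=120.0, b=0.1, c=30.0):
    """Logistic growth curve (illustrative parameters)."""
    return a / (1.0 + math.exp(-b * (t - c)))

models = [nhpp_exponential, nhpp_s_shaped, logistic_growth]
weights = [0.5, 0.3, 0.2]          # assumed performance weights, summing to 1

def bma_prediction(t):
    return sum(w * m(t) for w, m in zip(weights, models))

for t in (30, 60, 90):
    print(f"t={t:3d}: expected cumulative discoveries ~ {bma_prediction(t):.1f}")
```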
Engineering Management Journal, Dec 1, 2012
The Solar-Photovoltaic (PV) commercial power generation cost gap with traditional power can be narrowed by proposing a financing method involving a combination of a longer-term purchase power agreement and lower interest rates. We have validated this method using cash flow analysis of a mid-size (14-megawatt) California PV utility. We assess the economic feasibility of the project using sensitivity analysis, and examine the removal of the financing barrier by proposing a self-sustaining finance scheme as a possible alternative to a government subsidy, which lacks an inflow of cash to offset the outflow of subsidy payments. Furthermore, we highlight the scenarios in which PV power becomes cost competitive with conventional electricity generation.
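A minimal sketch of the underlying cash-flow reasoning, assuming invented project figures rather than the study's 14-megawatt California case: a longer purchase power agreement combined with a lower discount rate raises the project's net present value.

```python
# Illustrative cash-flow comparison only; sizes, prices, and rates are made up
# and do not reproduce the paper's case study.
import numpy as np

def project_npv(ppa_years, rate, annual_mwh=30_000, price_per_mwh=90.0,
                capex=35_000_000, opex=400_000):
    """NPV of a PV project under a fixed-price purchase power agreement."""
    years = np.arange(1, ppa_years + 1)
    cash = annual_mwh * price_per_mwh - opex          # net revenue each year
    discounted = cash / (1.0 + rate) ** years
    return discounted.sum() - capex

print(f"20-yr PPA at 8%: NPV = {project_npv(20, 0.08):,.0f}")
print(f"30-yr PPA at 5%: NPV = {project_npv(30, 0.05):,.0f}")
```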
IEEE Systems Journal, Jun 1, 2016
Around the world, renewable energy-generating systems (RES) have dramatically expanded in capacity and in energy generated. A variety of means have driven this expansion, including mandates to meet RES-based generation targets. Much of the RES-related literature has focused on improving technical aspects of performance, reducing integration barriers, or estimating benefits from increased RES generation. Little work has considered how RES technologies could satisfy mandated utility-scale generation targets. This paper proposes a quantitative risk model to estimate the probability of meeting a national RES generation target, using as a case study the United Kingdom's steps to meet its mandate under the European Union's 2009 Renewable Energy Directive. This paper presents the first part of the study by introducing the concept, the target, the approach, the data sources, and the resulting scenarios. It describes the model's assumptions and sensitivity analysis, concluding by summarizing the steps to complete the case study.
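One way to read "probability of meeting a national RES generation target" is as a Monte Carlo exercise over uncertain capacity and capacity factors. The sketch below shows that general idea with invented distributions and an invented target; it does not use the UK data or scenarios from the paper.

```python
# Monte Carlo sketch of the general idea: estimate the probability that total
# renewable generation meets a mandated target. Distributions and the target
# are assumptions, not the study's inputs.
import numpy as np

rng = np.random.default_rng(42)
N = 100_000
target_twh = 110.0                                   # assumed generation target

# Uncertain installed capacity (GW) and capacity factor by technology (assumed)
wind_gw  = rng.normal(28, 4, N)
wind_cf  = rng.uniform(0.28, 0.40, N)
solar_gw = rng.normal(14, 3, N)
solar_cf = rng.uniform(0.09, 0.12, N)

hours = 8760
generation_twh = (wind_gw * wind_cf + solar_gw * solar_cf) * hours / 1000.0
prob = (generation_twh >= target_twh).mean()
print(f"P(meet target) ~ {prob:.2f}")
```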
Project Management Journal, Dec 1, 2015
While system complexity is on the rise across many product lines, the resources required to successfully design and implement complex systems remain constrained. Because financiers of complex systems development efforts actively monitor project implementation cost, project performance models are needed to help project managers predict their cost compliance and avoid cost overruns. This article describes recent research conducted by the authors to develop a cost overrun predictive model using five known drivers of complex systems development cost. The study identifies schedule and reliability as the key determinants of whether or not a large complex systems development project will experience a cost overrun.
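As a hypothetical illustration of a cost-overrun predictive model, the sketch below fits a logistic regression relating two normalized drivers (labeled schedule pressure and reliability requirement here) to a binary overrun outcome on synthetic data; the paper's actual five drivers, dataset, and model form are not reproduced.

```python
# Illustrative only: a simple binary classifier relating project drivers to
# cost overrun. The data are synthetic and the model form is a stand-in.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(7)
n = 300
schedule_pressure = rng.uniform(0, 1, n)     # assumed normalized driver
reliability_req = rng.uniform(0, 1, n)       # assumed normalized driver
X = np.column_stack([schedule_pressure, reliability_req])

# Synthetic ground truth: overruns more likely with tight schedules and
# demanding reliability requirements.
logit = 3.0 * schedule_pressure + 2.0 * reliability_req - 2.5
y = (rng.uniform(0, 1, n) < 1.0 / (1.0 + np.exp(-logit))).astype(int)

model = LogisticRegression().fit(X, y)
print("coefficients:", model.coef_[0])
print("P(overrun | tight schedule, high reliability):",
      model.predict_proba([[0.9, 0.9]])[0, 1].round(2))
```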
IOS Press eBooks, Oct 31, 2022
Collaborative systems-of-systems (CSoSs) are defined by the Systems Engineering Body of Knowledge as groups of constituent systems that voluntarily work with each other toward a common goal. As the complexity, sociotechnical interactions, and cooperation of real systems increase, so too does our need to understand how to design and manage collaboration across disciplines. An agent-based model is developed that combines network evolution mechanisms with evolutionary game theory to simulate CSoSs. Collaboration efficiency (CE) is introduced as a metric by which collaboration may be measured and performance compared. Cost and strategy parameters of constituent systems are tested via CSoS model simulation to develop insights into best collaboration practices for CSoSs. Results suggest that a reactive collaboration strategy or a reinforcement algorithm-based strategy produces the highest CE under certain conditions. Applicable to systems in sociotechnical enterprises, logistics, energy, infrastructure, and more, this research can improve the design and operation of any CSoS.
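A toy version of the repeated collaboration game can convey how a reactive strategy and an efficiency-style metric might behave. The payoff values, the tit-for-tat-like reactive rule, and the efficiency ratio below are illustrative assumptions, not the paper's agent-based model or its collaboration efficiency definition.

```python
# Toy repeated-game sketch only: two constituent systems choose to collaborate
# or defect each round, and a simple efficiency ratio is reported.
import random

random.seed(3)
BENEFIT, COST = 3.0, 1.0            # payoff of receiving help vs. cost of helping

def reactive(history):
    """Collaborate first, then mirror the partner's last move (tit-for-tat-like)."""
    return history[-1] if history else True

def random_strategy(history):
    return random.random() < 0.5

def play(strat_a, strat_b, rounds=200):
    hist_a, hist_b, payoff = [], [], 0.0
    for _ in range(rounds):
        a = strat_a(hist_b)          # each agent reacts to the other's history
        b = strat_b(hist_a)
        payoff += (BENEFIT if b else 0.0) - (COST if a else 0.0)   # agent A
        payoff += (BENEFIT if a else 0.0) - (COST if b else 0.0)   # agent B
        hist_a.append(a); hist_b.append(b)
    max_payoff = 2 * rounds * (BENEFIT - COST)   # both collaborating every round
    return payoff / max_payoff                   # 1.0 = full mutual collaboration

print(f"reactive vs reactive: {play(reactive, reactive):.2f}")
print(f"reactive vs random:   {play(reactive, random_strategy):.2f}")
```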
Defense Acquisition Research Journal, Jul 1, 2021
Since at least 1990, designing and operating the Department of Defense (DoD) supply chain, and adapting it to ever-changing military requirements, have been highly complex and tightly coupled problems, which the highest levels of DoD recognize as weaknesses. Military supply chains face a wide range of challenges. Decisions made at the operational and tactical levels of logistics can alter the effectiveness of decisions made at the strategic level. Decisions must be made with incomplete information. As a result, practical solutions must simultaneously incorporate decisions made at all levels as well as take into account the uncertainty faced by the logistician. The design of modern military supply chains, particularly for large networks where many values are not known precisely, is recognized as too complex for many techniques found in the academic literature. Much of the literature in supply chain network design makes simplifying assumptions, such as constant per-unit transportation costs regardless of the size of the shipment, the shipping mode selected, the time available for the delivery, or the route taken. This article avoids these assumptions to provide an approach the practitioner can use when designing and adapting supply chain networks. This research proposes a simulation-based optimization approach to find a near-optimal solution to a large supply chain network design problem of the scale faced by a theater commander, while recognizing the complexity and uncertainty that the practicing military logistician must confront.
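The general simulation-based optimization loop described above can be sketched as: sample demand scenarios, simulate each candidate network design against them, and keep the design with the best expected performance. The cost model, design encoding, and scenario distribution below are invented for illustration and are far simpler than the article's network model.

```python
# Generic simulation-based optimization loop: evaluate each candidate network
# design against sampled demand scenarios and keep the best average performer.
# Costs, designs, and scenarios are illustrative only.
import random

random.seed(11)

def simulate_cost(design, demand):
    """Toy cost model: fixed cost per opened depot plus a shortfall penalty."""
    open_depots, capacity_each = design
    capacity = open_depots * capacity_each
    shortfall = max(0.0, demand - capacity)
    return 100 * open_depots + 5 * shortfall        # assumed cost coefficients

candidate_designs = [(d, c) for d in range(1, 6) for c in (50, 100, 150)]
scenarios = [random.gauss(300, 75) for _ in range(500)]   # sampled theater demand

def expected_cost(design):
    return sum(simulate_cost(design, s) for s in scenarios) / len(scenarios)

best = min(candidate_designs, key=expected_cost)
print(f"near-best design {best}, expected cost {expected_cost(best):.1f}")
```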